MAKE YOUR VOICE HEARD ON THE DIGITAL SERVICES ACT (DSA)

On 15 December 2020, the European Commission presented its proposal for a new Digital Services Act (DSA). In recent years, online platforms have gained the power to affect our fundamental rights, our society and our democracy. The DSA presents a chance to give people more rights and freedoms and to start building a better internet, with clear rules for take-downs of illegal content and more transparency and choice for users.

Please help us improve the proposal by contributing comments and suggestions on the Commission's text. Your contribution will be valuable for our work on amendments in the European Parliament. The discussion will close on Sunday, 23 May at midnight. Our teams remain at your disposal for any questions or further comments at joseph.mcnamee@europarl.europa.eu.

You can also comment on the proposed Digital Markets Act by clicking here.

Two ways to give feedback:

  1. You can leave a general remark concerning the text as a whole here.
  2. You can amend individual paragraphs using the plus icons. You can also comment while reading (without having to scroll to the very bottom) and even discuss existing annotations.

Proposal for a 

REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL 

on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC 

THE EUROPEAN PARLIAMENT AND THE COUNCIL OF THE EUROPEAN UNION,

Having regard to the Treaty on the Functioning of the European Union, and in particular Article 114 thereof,

Having regard to the proposal from the European Commission,

After transmission of the draft legislative act to the national parliaments,

Having regard to the opinion of the European Economic and Social Committee,

Having regard to the opinion of the Committee of the Regions,

Having regard to the opinion of the European Data Protection Supervisor,

Acting in accordance with the ordinary legislative procedure,
Whereas:

  1. Information society services and especially intermediary services have become an important part of the Union’s economy and daily life of Union citizens. Twenty years after the adoption of the existing legal framework applicable to such services laid down in Directive 2000/31/EC of the European Parliament and of the Council, new and innovative business models and services, such as online social networks and marketplaces, have allowed business users and consumers to impart and access information and engage in transactions in novel ways. A majority of Union citizens now uses those services on a daily basis. However, the digital transformation and increased use of those services has also resulted in new risks and challenges, both for individual users and for society as a whole.
  2. Member States are increasingly introducing, or are considering introducing, national laws on the matters covered by this Regulation, imposing, in particular, diligence requirements for providers of intermediary services. Those diverging national laws negatively affect the internal market, which, pursuant to Article 26 of the Treaty, comprises an area without internal frontiers in which the free movement of goods and services and freedom of establishment are ensured, taking into account the inherently cross-border nature of the internet, which is generally used to provide those services. The conditions for the provision of intermediary services across the internal market should be harmonised, so as to provide businesses with access to new markets and opportunities to exploit the benefits of the internal market, while allowing consumers and other recipients of the services to have increased choice. 
  3. Responsible and diligent behaviour by providers of intermediary services is essential for a safe, predictable and trusted online environment and for allowing Union citizens and other persons to exercise their fundamental rights guaranteed in the Charter of Fundamental Rights of the European Union (‘Charter’), in particular the freedom of expression and information and the freedom to conduct a business, and the right to non-discrimination. 
  4. Therefore, in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary in order to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated. 
  5. This Regulation should apply to providers of certain information society services as defined in Directive (EU) 2015/1535 of the European Parliament and of the Council, that is, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient. Specifically, this Regulation should apply to providers of intermediary services, and in particular intermediary services consisting of services known as ‘mere conduit’, ‘caching’ and ‘hosting’ services, given that the exponential growth of the use made of those services, mainly for legitimate and socially beneficial purposes of all kinds, has also increased their role in the intermediation and spread of unlawful or otherwise harmful information and activities. 
  6. In practice, certain providers of intermediary services intermediate in relation to services that may or may not be provided by electronic means, such as remote information technology services, transport, accommodation or delivery services. This Regulation should apply only to intermediary services and not affect requirements set out in Union or national law relating to products or services intermediated through intermediary services, including in situations where the intermediary service constitutes an integral part of another service which is not an intermediary service as specified in the case law of the Court of Justice of the European Union. 
  7. In order to ensure the effectiveness of the rules laid down in this Regulation and a level playing field within the internal market, those rules should apply to providers of intermediary services irrespective of their place of establishment or residence, in so far as they provide services in the Union, as evidenced by a substantial connection to the Union. 
  8. Such a substantial connection to the Union should be considered to exist where the service provider has an establishment in the Union or, in its absence, on the basis of the existence of a significant number of users in one or more Member States, or the targeting of activities towards one or more Member States. The targeting of activities towards one or more Member States can be determined on the basis of all relevant circumstances, including factors such as the use of a language or a currency generally used in that Member State, or the possibility of ordering products or services, or using a national top level domain. The targeting of activities towards a Member State could also be derived from the availability of an application in the relevant national application store, from the provision of local advertising or advertising in the language used in that Member State, or from the handling of customer relations such as by providing customer service in the language generally used in that Member State. A substantial connection should also be assumed where a service provider directs its activities to one or more Member States as set out in Article 17(1)(c) of Regulation (EU) 1215/2012 of the European Parliament and of the Council. On the other hand, mere technical accessibility of a website from the Union cannot, on that ground alone, be considered as establishing a substantial connection to the Union.
  9. This Regulation should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services, in particular Directive 2000/31/EC, with the exception of those changes introduced by this Regulation, Directive 2010/13/EU of the European Parliament and of the Council as amended, and Regulation (EU) …/.. of the European Parliament and of the Council – proposed Terrorist Content Online Regulation. Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level. 
  10. For reasons of clarity, it should also be specified that this Regulation is without prejudice to Regulation (EU) 2019/1148 of the European Parliament and of the Council and Regulation (EU) 2019/1150 of the European Parliament and of the Council, Directive 2002/58/EC of the European Parliament and of the Council and Regulation […/…] on temporary derogation from certain provisions of Directive 2002/58/EC as well as Union law on consumer protection, in particular Directive 2005/29/EC of the European Parliament and of the Council, Directive 2011/83/EU of the European Parliament and of the Council and Directive 93/13/EEC of the European Parliament and of the Council, as amended by Directive (EU) 2019/2161 of the European Parliament and of the Council, and on the protection of personal data, in particular Regulation (EU) 2016/679 of the European Parliament and of the Council. The protection of individuals with regard to the processing of personal data is solely governed by the rules of Union law on that subject, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. This Regulation is also without prejudice to the rules of Union law on working conditions.
  11. It should be clarified that this Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected. 
  12. In order to achieve the objective of ensuring a safe, predictable and trusted online environment, for the purpose of this Regulation the concept of ‘illegal content’ should be defined broadly and should also cover information relating to illegal content, products, services and activities. In particular, that concept should be understood to refer to information, irrespective of its form, that under the applicable law is either itself illegal, such as illegal hate speech or terrorist content and unlawful discriminatory content, or that relates to activities that are illegal, such as the sharing of images depicting child sexual abuse, unlawful non-consensual sharing of private images, online stalking, the sale of non-compliant or counterfeit products, the non-authorised use of copyright protected material or activities involving infringements of consumer protection law. In this regard, it is immaterial whether the illegality of the information or activity results from Union law or from national law that is consistent with Union law and what the precise nature or subject matter is of the law in question.
  13. Considering the particular characteristics of the services concerned and the corresponding need to make the providers thereof subject to certain specific obligations, it is necessary to distinguish, within the broader category of providers of hosting services as defined in this Regulation, the subcategory of online platforms. Online platforms, such as social networks or online marketplaces, should be defined as providers of hosting services that not only store information provided by the recipients of the service at their request, but that also disseminate that information to the public, again at their request. However, in order to avoid imposing overly broad obligations, providers of hosting services should not be considered as online platforms where the dissemination to the public is merely a minor and purely ancillary feature of another service and that feature cannot, for objective technical reasons, be used without that other, principal service, and the integration of that feature is not a means to circumvent the applicability of the rules of this Regulation applicable to online platforms. For example, the comments section in an online newspaper could constitute such a feature, where it is clear that it is ancillary to the main service represented by the publication of news under the editorial responsibility of the publisher. 
  14. The concept of ‘dissemination to the public’, as used in this Regulation, should entail the making available of information to a potentially unlimited number of persons, that is, making the information easily accessible to users in general without further action by the recipient of the service providing the information being required, irrespective of whether those persons actually access the information in question. The mere possibility to create groups of users of a given service should not, in itself, be understood to mean that the information disseminated in that manner is not disseminated to the public. However, the concept should exclude dissemination of information within closed groups consisting of a finite number of pre-determined persons. Interpersonal communication services, as defined in Directive (EU) 2018/1972 of the European Parliament and of the Council, such as emails or private messaging services, fall outside the scope of this Regulation. Information should be considered disseminated to the public within the meaning of this Regulation only where that occurs upon the direct request by the recipient of the service that provided the information. 
  15. Where some of the services provided by a provider are covered by this Regulation whilst others are not, or where the services provided by a provider are covered by different sections of this Regulation, the relevant provisions of this Regulation should apply only in respect of those services that fall within their scope. 
  16. The legal certainty provided by the horizontal framework of conditional exemptions from liability for providers of intermediary services, laid down in Directive 2000/31/EC, has allowed many novel services to emerge and scale up across the internal market. That framework should therefore be preserved. However, in view of the divergences when transposing and applying the relevant rules at national level, and for reasons of clarity and coherence, that framework should be incorporated in this Regulation. It is also necessary to clarify certain elements of that framework, having regard to case law of the Court of Justice of the European Union.
  17. The relevant rules of Chapter II should only establish when the provider of intermediary services concerned cannot be held liable in relation to illegal content provided by the recipients of the service. Those rules should not be understood to provide a positive basis for establishing when a provider can be held liable, which is for the applicable rules of Union or national law to determine. Furthermore, the exemptions from liability established in this Regulation should apply in respect of any type of liability as regards any type of illegal content, irrespective of the precise subject matter or nature of those laws. 
  18. The exemptions from liability established in this Regulation should not apply where, instead of confining itself to providing the services neutrally, by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of intermediary service itself, including where the information has been developed under the editorial responsibility of that provider. 
  19. In view of the different nature of the activities of ‘mere conduit’, ‘caching’ and ‘hosting’ and the different position and abilities of the providers of the services in question, it is necessary to distinguish the rules applicable to those activities, in so far as under this Regulation they are subject to different requirements and conditions and their scope differs, as interpreted by the Court of Justice of the European Union. 
  20. A provider of intermediary services that deliberately collaborates with a recipient of the services in order to undertake illegal activities does not provide its service neutrally and should therefore not be able to benefit from the exemptions from liability provided for in this Regulation. 
  21. A provider should be able to benefit from the exemptions from liability for ‘mere conduit’ and for ‘caching’ services when it is in no way involved with the information transmitted. This requires, among other things, that the provider does not modify the information that it transmits. However, this requirement should not be understood to cover manipulations of a technical nature which take place in the course of the transmission, as such manipulations do not alter the integrity of the information transmitted. 
  22. In order to benefit from the exemption from liability for hosting services, the provider should, upon obtaining actual knowledge or awareness of illegal content, act expeditiously to remove or to disable access to that content. The removal or disabling of access should be undertaken in the observance of the principle of freedom of expression. The provider can obtain such actual knowledge or awareness through, in particular, its own-initiative investigations or notices submitted to it by individuals or entities in accordance with this Regulation in so far as those notices are sufficiently precise and adequately substantiated to allow a diligent economic operator to reasonably identify, assess and where appropriate act against the allegedly illegal content. 
  23. In order to ensure the effective protection of consumers when engaging in intermediated commercial transactions online, certain providers of hosting services, namely, online platforms that allow consumers to conclude distance contracts with traders, should not be able to benefit from the exemption from liability for hosting service providers established in this Regulation, in so far as those online platforms present the relevant information relating to the transactions at issue in such a way that it leads consumers to believe that the information was provided by those online platforms themselves or by recipients of the service acting under their authority or control, and that those online platforms thus have knowledge of or control over the information, even if that may in reality not be the case. In that regard, it should be determined objectively, on the basis of all relevant circumstances, whether the presentation could lead to such a belief on the side of an average and reasonably well-informed consumer.
  24. The exemptions from liability established in this Regulation should not affect the possibility of injunctions of different kinds against providers of intermediary services, even where they meet the conditions set out as part of those exemptions. Such injunctions could, in particular, consist of orders by courts or administrative authorities requiring the termination or prevention of any infringement, including the removal of illegal content specified in such orders, issued in compliance with Union law, or the disabling of access to it. 
  25. In order to create legal certainty and not to discourage activities aimed at detecting, identifying and acting against illegal content that providers of intermediary services may undertake on a voluntary basis, it should be clarified that the mere fact that providers undertake such activities does not lead to the unavailability of the exemptions from liability set out in this Regulation, provided those activities are carried out in good faith and in a diligent manner. In addition, it is appropriate to clarify that the mere fact that those providers take measures, in good faith, to comply with the requirements of Union law, including those set out in this Regulation as regards the implementation of their terms and conditions, should not lead to the unavailability of those exemptions from liability. Therefore, any such activities and measures that a given provider may have taken should not be taken into account when determining whether the provider can rely on an exemption from liability, in particular as regards whether the provider provides its service neutrally and can therefore fall within the scope of the relevant provision, without this rule however implying that the provider can necessarily rely thereon. 
  26. Whilst the rules in Chapter II of this Regulation concentrate on the exemption from liability of providers of intermediary services, it is important to recall that, despite the generally important role played by those providers, the problem of illegal content and activities online should not be dealt with by solely focusing on their liability and responsibilities. Where possible, third parties affected by illegal content transmitted or stored online should attempt to resolve conflicts relating to such content without involving the providers of intermediary services in question. Recipients of the service should be held liable, where the applicable rules of Union and national law determining such liability so provide, for the illegal content that they provide and may disseminate through intermediary services. Where appropriate, other actors, such as group moderators in closed online environments, in particular in the case of large groups, should also help to avoid the spread of illegal content online, in accordance with the applicable law. Furthermore, where it is necessary to involve information society services providers, including providers of intermediary services, any requests or orders for such involvement should, as a general rule, be directed to the actor that has the technical and operational ability to act against specific items of illegal content, so as to prevent and minimise any possible negative effects for the availability and accessibility of information that is not illegal content. 
  27. Since 2000, new technologies have emerged that improve the availability, efficiency, speed, reliability, capacity and security of systems for the transmission and storage of data online, leading to an increasingly complex online ecosystem. In this regard, it should be recalled that providers of services establishing and facilitating the underlying logical architecture and proper functioning of the internet, including technical auxiliary functions, can also benefit from the exemptions from liability set out in this Regulation, to the extent that their services qualify as ‘mere conduit’, ‘caching’ or ‘hosting’ services. Such services include, as the case may be, wireless local area networks, domain name system (DNS) services, top-level domain name registries, certificate authorities that issue digital certificates, or content delivery networks, that enable or improve the functions of other providers of intermediary services. Likewise, services used for communications purposes, and the technical means of their delivery, have also evolved considerably, giving rise to online services such as Voice over IP, messaging services and web-based e-mail services, where the communication is delivered via an internet access service. Those services, too, can benefit from the exemptions from liability, to the extent that they qualify as ‘mere conduit’, ‘caching’ or ‘hosting’ services.
  28. Providers of intermediary services should not be subject to a monitoring obligation with respect to obligations of a general nature. This does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation, in accordance with the conditions established in this Regulation. Nothing in this Regulation should be construed as an imposition of a general monitoring obligation or active fact-finding obligation, or as a general obligation for providers to take proactive measures in relation to illegal content.
  29. Depending on the legal system of each Member State and the field of law at issue, national judicial or administrative authorities may order providers of intermediary services to act against certain specific items of illegal content or to provide certain specific items of information. The national laws on the basis of which such orders are issued differ considerably and the orders are increasingly addressed in cross-border situations. In order to ensure that those orders can be complied with in an effective and efficient manner, so that the public authorities concerned can carry out their tasks and the providers are not subject to any disproportionate burdens, without unduly affecting the rights and legitimate interests of any third parties, it is necessary to set certain conditions that those orders should meet and certain complementary requirements relating to the processing of those orders. 
  30. Orders to act against illegal content or to provide information should be issued in compliance with Union law, in particular Regulation (EU) 2016/679 and the prohibition of general obligations to monitor information or to actively seek facts or circumstances indicating illegal activity laid down in this Regulation. The conditions and requirements laid down in this Regulation which apply to orders to act against illegal content are without prejudice to other Union acts providing for similar systems for acting against specific types of illegal content, such as Regulation (EU) …/…. [proposed Regulation addressing the dissemination of terrorist content online], or Regulation (EU) 2017/2394 that confers specific powers to order the provision of information on Member State consumer law enforcement authorities, whilst the conditions and requirements that apply to orders to provide information are without prejudice to other Union acts providing for similar relevant rules for specific sectors. Those conditions and requirements should be without prejudice to retention and preservation rules under applicable national law, in conformity with Union law and confidentiality requests by law enforcement authorities related to the non-disclosure of information. 
  31. The territorial scope of such orders to act against illegal content should be clearly set out on the basis of the applicable Union or national law enabling the issuance of the order and should not exceed what is strictly necessary to achieve its objectives. In that regard, the national judicial or administrative authority issuing the order should balance the objective that the order seeks to achieve, in accordance with the legal basis enabling its issuance, with the rights and legitimate interests of all third parties that may be affected by the order, in particular their fundamental rights under the Charter. In addition, where the order referring to the specific information may have effects beyond the territory of the Member State of the authority concerned, the authority should assess whether the information at issue is likely to constitute illegal content in other Member States concerned and, where relevant, take account of the relevant rules of Union law or international law and the interests of international comity. 
  32. The orders to provide information regulated by this Regulation concern the production of specific information about individual recipients of the intermediary service concerned who are identified in those orders for the purposes of determining compliance by the recipients of the services with applicable Union or national rules. Therefore, orders about information on a group of recipients of the service who are not specifically identified, including orders to provide aggregate information required for statistical purposes or evidence-based policy-making, should remain unaffected by the rules of this Regulation on the provision of information. 
  33. Orders to act against illegal content and to provide information are subject to the rules safeguarding the competence of the Member State where the service provider addressed is established and laying down possible derogations from that competence in certain cases, set out in Article 3 of Directive 2000/31/EC, only if the conditions of that Article are met. Given that the orders in question relate to specific items of illegal content and information, respectively, where they are addressed to providers of intermediary services established in another Member State, they do not in principle restrict those providers’ freedom to provide their services across borders. Therefore, the rules set out in Article 3 of Directive 2000/31/EC, including those regarding the need to justify measures derogating from the competence of the Member State where the service provider is established on certain specified grounds and regarding the notification of such measures, do not apply in respect of those orders. 
  34. In order to achieve the objectives of this Regulation, and in particular to improve the functioning of the internal market and ensure a safe and transparent online environment, it is necessary to establish a clear and balanced set of harmonised due diligence obligations for providers of intermediary services. Those obligations should aim in particular to guarantee different public policy objectives such as the safety and trust of the recipients of the service, including minors and vulnerable users, to protect the relevant fundamental rights enshrined in the Charter, to ensure meaningful accountability of those providers and to empower recipients and other affected parties, whilst facilitating the necessary oversight by competent authorities.
  35. In that regard, it is important that the due diligence obligations are adapted to the type and nature of the intermediary service concerned. This Regulation therefore sets out basic obligations applicable to all providers of intermediary services, as well as additional obligations for providers of hosting services and, more specifically, online platforms and very large online platforms. To the extent that providers of intermediary services may fall within those different categories in view of the nature of their services and their size, they should comply with all of the corresponding obligations of this Regulation. Those harmonised due diligence obligations, which should be reasonable and non-arbitrary, are needed to address the identified public policy concerns, such as safeguarding the legitimate interests of the recipients of the service, addressing illegal practices and protecting fundamental rights online.
  36. In order to facilitate smooth and efficient communications relating to matters covered by this Regulation, providers of intermediary services should be required to establish a single point of contact and to publish relevant information relating to their point of contact, including the languages to be used in such communications. The point of contact can also be used by trusted flaggers and by professional entities which are under a specific relationship with the provider of intermediary services. In contrast to the legal representative, the point of contact should serve operational purposes and should not necessarily have a physical location.
  37. Providers of intermediary services that are established in a third country that offer services in the Union should designate a sufficiently mandated legal representative in the Union and provide information relating to their legal representatives, so as to allow for the effective oversight and, where necessary, enforcement of this Regulation in relation to those providers. It should be possible for the legal representative to also function as point of contact, provided the relevant requirements of this Regulation are complied with. 
  38. Whilst the freedom of contract of providers of intermediary services should in principle be respected, it is appropriate to set certain rules on the content, application and enforcement of the terms and conditions of those providers in the interests of transparency, the protection of recipients of the service and the avoidance of unfair or arbitrary outcomes. 
  39. To ensure an adequate level of transparency and accountability, providers of intermediary services should annually report, in accordance with the harmonised requirements contained in this Regulation, on the content moderation they engage in, including the measures taken as a result of the application and enforcement of their terms and conditions. However, so as to avoid disproportionate burdens, those transparency reporting obligations should not apply to providers that are micro- or small enterprises as defined in Commission Recommendation 2003/361/EC. 
  40. Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned (‘notice’), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content (‘action’). Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in as far as they qualify as providers of hosting services covered by this Regulation. 
  41. The rules on such notice and action mechanisms should be harmonised at Union level, so as to provide for the timely, diligent and objective processing of notices on the basis of rules that are uniform, transparent and clear and that provide for robust safeguards to protect the rights and legitimate interests of all affected parties, in particular their fundamental rights guaranteed by the Charter, irrespective of the Member State in which those parties are established or reside and of the field of law at issue. The fundamental rights include, as the case may be, the right to freedom of expression and information, the right to respect for private and family life, the right to protection of personal data, the right to non-discrimination and the right to an effective remedy of the recipients of the service; the freedom to conduct a business, including the freedom of contract, of service providers; as well as the right to human dignity, the rights of the child, the right to protection of property, including intellectual property, and the right to non-discrimination of parties affected by illegal content.
  42. Where a hosting service provider decides to remove or disable information provided by a recipient of the service, for instance following receipt of a notice or acting on its own initiative, including through the use of automated means, that provider should inform the recipient of its decision, the reasons for its decision and the available redress possibilities to contest the decision, in view of the negative consequences that such decisions may have for the recipient, including as regards the exercise of its fundamental right to freedom of expression. That obligation should apply irrespective of the reasons for the decision, in particular whether the action has been taken because the information notified is considered to be illegal content or incompatible with the applicable terms and conditions. Available means of recourse to challenge the decision of the hosting service provider should always include judicial redress.
  43. To avoid disproportionate burdens, the additional obligations imposed on online platforms under this Regulation should not apply to micro or small enterprises as defined in Recommendation 2003/361/EC of the Commission, unless their reach and impact is such that they meet the criteria to qualify as very large online platforms under this Regulation. The consolidation rules laid down in that Recommendation help ensure that any circumvention of those additional obligations is prevented. The exemption of micro- and small enterprises from those additional obligations should not be understood as affecting their ability to set up, on a voluntary basis, a system that complies with one or more of those obligations. 
  44. Recipients of the service should be able to easily and effectively contest certain decisions of online platforms that negatively affect them. Therefore, online platforms should be required to provide for internal complaint-handling systems, which meet certain conditions aimed at ensuring that the systems are easily accessible and lead to swift and fair outcomes. In addition, provision should be made for the possibility of out-of-court settlement of disputes, including those that could not be resolved in a satisfactory manner through the internal complaint-handling systems, by certified bodies that have the requisite independence, means and expertise to carry out their activities in a fair, swift and cost-effective manner. The possibilities to contest decisions of online platforms thus created should complement, yet leave unaffected in all respects, the possibility to seek judicial redress in accordance with the laws of the Member State concerned.
  45. For contractual consumer-to-business disputes over the purchase of goods or services, Directive 2013/11/EU of the European Parliament and of the Council ensures that Union consumers and businesses in the Union have access to quality-certified alternative dispute resolution entities. In this regard, it should be clarified that the rules of this Regulation on out-of-court dispute settlement are without prejudice to that Directive, including the right of consumers under that Directive to withdraw from the procedure at any stage if they are dissatisfied with the performance or the operation of the procedure. 
  46. Action against illegal content can be taken more quickly and reliably where online platforms take the necessary measures to ensure that notices submitted by trusted flaggers through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and objective manner. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content, that they represent collective interests and that they work in a diligent and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and semi-public bodies, such as the organisations that are part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. For intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions. The rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status under this Regulation, or from otherwise cooperating with other entities, in accordance with the applicable law, including this Regulation and Regulation (EU) 2016/794 of the European Parliament and of the Council.
  47. The misuse of services of online platforms by frequently providing manifestly illegal content or by frequently submitting manifestly unfounded notices or complaints under the mechanisms and systems, respectively, established under this Regulation undermines trust and harms the rights and legitimate interests of the parties concerned. Therefore, there is a need to put in place appropriate and proportionate safeguards against such misuse. Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or that the notices or complaints are unfounded, respectively. Under certain conditions, online platforms should temporarily suspend their relevant activities in respect of the person engaged in abusive behaviour. This is without prejudice to the freedom of online platforms to determine their terms and conditions and establish stricter measures in the case of manifestly illegal content related to serious crimes. For reasons of transparency, this possibility should be set out, clearly and in sufficient detail, in the terms and conditions of the online platforms. Redress against the decisions taken in this regard by online platforms should always be available, and those decisions should be subject to oversight by the competent Digital Services Coordinator. The rules of this Regulation on misuse should not prevent online platforms from taking other measures to address the provision of illegal content by recipients of their service or other misuse of their services, in accordance with the applicable Union and national law. Those rules are without prejudice to any possibility to hold the persons engaged in misuse liable, including for damages, provided for in Union or national law.
  48. An online platform may in some instances become aware, such as through a notice by a notifying party or through its own voluntary measures, of information relating to certain activity of a recipient of the service, such as the provision of certain types of illegal content, that reasonably justify, having regard to all relevant circumstances of which the online platform is aware, the suspicion that the recipient may have committed, may be committing or is likely to commit a serious criminal offence involving a threat to the life or safety of persons, such as offences specified in Directive 2011/93/EU of the European Parliament and of the Council. In such instances, the online platform should inform without delay the competent law enforcement authorities of such suspicion, providing all relevant information available to it, including where relevant the content in question and an explanation of its suspicion. This Regulation does not provide the legal basis for profiling of recipients of the services with a view to the possible identification of criminal offences by online platforms. Online platforms should also respect other applicable rules of Union or national law for the protection of the rights and freedoms of individuals when informing law enforcement authorities.
  49. In order to contribute to a safe, trustworthy and transparent online environment for consumers, as well as for other interested parties such as competing traders and holders of intellectual property rights, and to deter traders from selling products or services in violation of the applicable rules, online platforms allowing consumers to conclude distance contracts with traders should ensure that such traders are traceable. The trader should therefore be required to provide certain essential information to the online platform, including for the purposes of promoting messages on, or offering, products. That requirement should also be applicable to traders that promote messages on products or services on behalf of brands, based on underlying agreements. Those online platforms should store all information in a secure manner for a reasonable period of time that does not exceed what is necessary, so that it can be accessed, in accordance with the applicable law, including on the protection of personal data, by public authorities and private parties with a legitimate interest, including through the orders to provide information referred to in this Regulation.
  50. To ensure an efficient and adequate application of that obligation, without imposing any disproportionate burdens, the online platforms covered should make reasonable efforts to verify the reliability of the information provided by the traders concerned, in particular by using freely available official online databases and online interfaces, such as national trade registers and the VAT Information Exchange System, or by requesting the traders concerned to provide trustworthy supporting documents, such as copies of identity documents, certified bank statements, company certificates and trade register certificates. They may also use other sources, available for use at a distance, which offer a similar degree of reliability for the purpose of complying with this obligation. However, the online platforms covered should not be required to engage in excessive or costly online fact-finding exercises or to carry out verifications on the spot. Nor should such online platforms, which have made the reasonable efforts required by this Regulation, be understood as guaranteeing the reliability of the information towards consumers or other interested parties. Such online platforms should also design and organise their online interface in a way that enables traders to comply with their obligations under Union law, in particular the requirements set out in Articles 6 and 8 of Directive 2011/83/EU of the European Parliament and of the Council, Article 7 of Directive 2005/29/EC of the European Parliament and of the Council and Article 3 of Directive 98/6/EC of the European Parliament and of the Council.
  51. In view of the particular responsibilities and obligations of online platforms, they should be made subject to transparency reporting obligations, which apply in addition to the transparency reporting obligations applicable to all providers of intermediary services under this Regulation. For the purposes of determining whether online platforms may be very large online platforms that are subject to certain additional obligations under this Regulation, the transparency reporting obligations for online platforms should include certain obligations relating to the publication and communication of information on the average monthly active recipients of the service in the Union. 
  52. Online advertisement plays an important role in the online environment, including in relation to the provision of the services of online platforms. However, online advertisement can contribute to significant risks, ranging from advertisement that is itself illegal content, to contributing to financial incentives for the publication or amplification of illegal or otherwise harmful content and activities online, or the discriminatory display of advertising with an impact on the equal treatment and opportunities of citizens. In addition to the requirements resulting from Article 6 of Directive 2000/31/EC, online platforms should therefore be required to ensure that the recipients of the service have certain individualised information necessary for them to understand when and on whose behalf the advertisement is displayed. In addition, recipients of the service should have information on the main parameters used for determining that specific advertising is to be displayed to them, providing meaningful explanations of the logic used to that end, including when this is based on profiling. The requirements of this Regulation on the provision of information relating to advertisement are without prejudice to the application of the relevant provisions of Regulation (EU) 2016/679, in particular those regarding the right to object, automated individual decision-making, including profiling, and specifically the need to obtain consent of the data subject prior to the processing of personal data for targeted advertising. Similarly, they are without prejudice to the provisions laid down in Directive 2002/58/EC, in particular those regarding the storage of information in terminal equipment and the access to information stored therein.
  53. Given the importance of very large online platforms, due to their reach, in particular as expressed in number of recipients of the service, in facilitating public debate, economic transactions and the dissemination of information, opinions and ideas and in influencing how recipients obtain and communicate information online, it is necessary to impose specific obligations on those platforms, in addition to the obligations applicable to all online platforms. Those additional obligations on very large online platforms are necessary to address those public policy concerns, there being no alternative and less restrictive measures that would effectively achieve the same result. 
  54. Very large online platforms may cause societal risks, different in scope and impact from those caused by smaller platforms. Once the number of recipients of a platform reaches a significant share of the Union population, the systemic risks the platform poses have a disproportionately negative impact in the Union. Such significant reach should be considered to exist where the number of recipients exceeds an operational threshold set at 45 million, that is, a number equivalent to 10% of the Union population. The operational threshold should be kept up to date through amendments enacted by delegated acts, where necessary. Such very large online platforms should therefore bear the highest standard of due diligence obligations, proportionate to their societal impact and means. 
  55. In view of the network effects characterising the platform economy, the user base of an online platform may quickly expand and reach the dimension of a very large online platform, with the related impact on the internal market. This may be the case in the event of exponential growth experienced in short periods of time, or of a large global presence and turnover allowing the online platform to fully exploit network effects and economies of scale and of scope. A high annual turnover or market capitalisation can in particular be an indication of fast scalability in terms of user reach. In those cases, the Digital Services Coordinator should be able to request more frequent reporting from the platform on the user base to be able to identify, in a timely manner, the moment at which that platform should be designated as a very large online platform for the purposes of this Regulation.
  56. Very large online platforms are used in a way that strongly influences safety online, the shaping of public opinion and discourse, as well as online trade. The way they design their services is generally optimised to benefit their often advertising-driven business models and can cause societal concerns. In the absence of effective regulation and enforcement, they can set the rules of the game, without effectively identifying and mitigating the risks and the societal and economic harm they can cause. Under this Regulation, very large online platforms should therefore assess the systemic risks stemming from the functioning and use of their service, as well as from potential misuse of their service by the recipients, and take appropriate mitigating measures.
  57. Three categories of systemic risks should be assessed in-depth. A first category concerns the risks associated with the misuse of their service through the dissemination of illegal content, such as the dissemination of child sexual abuse material or illegal hate speech, and the conduct of illegal activities, such as the sale of products or services prohibited by Union or national law, including counterfeit products. For example, and without prejudice to the personal responsibility of the recipient of the service of very large online platforms for possible illegality of his or her activity under the applicable law, such dissemination or activities may constitute a significant systemic risk where access to such content may be amplified through accounts with a particularly wide reach. A second category concerns the impact of the service on the exercise of fundamental rights, as protected by the Charter of Fundamental Rights, including the freedom of expression and information, the right to private life, the right to non-discrimination and the rights of the child. Such risks may arise, for example, in relation to the design of the algorithmic systems used by the very large online platform or the misuse of their service through the submission of abusive notices or other methods for silencing speech or hampering competition. A third category of risks concerns the intentional and, oftentimes, coordinated manipulation of the platform’s service, with a foreseeable impact on health, civic discourse, electoral processes, public security and protection of minors, having regard to the need to safeguard public order, protect privacy and fight fraudulent and deceptive commercial practices. Such risks may arise, for example, through the creation of fake accounts, the use of bots, and other automated or partially automated behaviours, which may lead to the rapid and widespread dissemination of information that is illegal content or incompatible with an online platform’s terms and conditions.
  58. Very large online platforms should deploy the necessary means to diligently mitigate the systemic risks identified in the risk assessment. Very large online platforms should under such mitigating measures consider, for example, enhancing or otherwise adapting the design and functioning of their content moderation, algorithmic recommender systems and online interfaces, so that they discourage and limit the dissemination of illegal content, adapting their decision-making processes, or adapting their terms and conditions. They may also include corrective measures, such as discontinuing advertising revenue for specific content, or other actions, such as improving the visibility of authoritative information sources. Very large online platforms may reinforce their internal processes or supervision of any of their activities, in particular as regards the detection of systemic risks. They may also initiate or increase cooperation with trusted flaggers, organise training sessions and exchanges with trusted flagger organisations, and cooperate with other service providers, including by initiating or joining existing codes of conduct or other self-regulatory measures. Any measures adopted should respect the due diligence requirements of this Regulation and be effective and appropriate for mitigating the specific risks identified, in the interest of safeguarding public order, protecting privacy and fighting fraudulent and deceptive commercial practices, and should be proportionate in light of the very large online platform’s economic capacity and the need to avoid unnecessary restrictions on the use of their service, taking due account of potential negative effects on the fundamental rights of the recipients of the service. 
  59. Very large online platforms should, where appropriate, conduct their risk assessments and design their risk mitigation measures with the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. 
  60. Given the need to ensure verification by independent experts, very large online platforms should be accountable, through independent auditing, for their compliance with the obligations laid down by this Regulation and, where relevant, any complementary commitments undertaken pursuant to codes of conduct and crisis protocols. They should give the auditor access to all relevant data necessary to perform the audit properly. Auditors should also be able to make use of other sources of objective information, including studies by vetted researchers. Auditors should guarantee the confidentiality, security and integrity of the information, such as trade secrets, that they obtain when performing their tasks and have the necessary expertise in the area of risk management and technical competence to audit algorithms. Auditors should be independent, so as to be able to perform their tasks in an adequate and trustworthy manner. If their independence is not beyond doubt, they should resign or abstain from the audit engagement.
  61. The audit report should be substantiated, so as to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to, the measures taken by the very large online platform to comply with its obligations under this Regulation. The report should be transmitted to the Digital Services Coordinator of establishment and the Board without delay, together with the risk assessment and the mitigation measures, as well as the platform’s plans for addressing the audit’s recommendations. The report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A positive opinion should be given where all evidence shows that the very large online platform complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A positive opinion should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A negative opinion should be given where the auditor considers that the very large online platform does not comply with this Regulation or the commitments undertaken. 
  62. A core part of a very large online platform’s business is the manner in which information is prioritised and presented on its online interface to facilitate and optimise access to information for the recipients of the service. This is done, for example, by algorithmically suggesting, ranking and prioritising information, distinguishing through text or other visual representations, or otherwise curating information provided by recipients. Such recommender systems can have a significant impact on the ability of recipients to retrieve and interact with information online. They also play an important role in the amplification of certain messages, the viral dissemination of information and the stimulation of online behaviour. Consequently, very large online platforms should ensure that recipients are appropriately informed, and can influence the information presented to them. They should clearly present the main parameters for such recommender systems in an easily comprehensible manner to ensure that the recipients understand how information is prioritised for them. They should also ensure that the recipients enjoy alternative options for the main parameters, including options that are not based on profiling of the recipient. 
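To make this recital more concrete for the discussion, the following minimal Python sketch illustrates one way a hypothetical platform could present the main parameters of its recommender system and offer an option that is not based on profiling. All names and the ranking logic are invented for illustration and are not prescribed by the proposal.

# Illustrative sketch only (not part of the proposal): a hypothetical, heavily
# simplified recommender that documents its main parameters and offers an
# alternative option that does not rely on profiling, in the spirit of recital 62.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List, Optional

@dataclass
class Item:
    item_id: str
    published: datetime
    topics: List[str]

@dataclass
class RecipientProfile:
    # Interest weights inferred from past behaviour (used only by the profiling option).
    interests: Dict[str, float] = field(default_factory=dict)

# The "main parameters", presented to recipients in plain language.
MAIN_PARAMETERS = {
    "personalised": "Ranks items by how closely their topics match your inferred interests.",
    "chronological": "Ranks items by recency only; no profiling of the recipient is used.",
}

def rank_items(items: List[Item], option: str,
               profile: Optional[RecipientProfile] = None) -> List[Item]:
    """Return items in the order they would be presented for the chosen option."""
    if option == "chronological":
        return sorted(items, key=lambda i: i.published, reverse=True)
    if option == "personalised" and profile is not None:
        return sorted(items,
                      key=lambda i: sum(profile.interests.get(t, 0.0) for t in i.topics),
                      reverse=True)
    raise ValueError(f"Unknown or misconfigured option: {option!r}")

if __name__ == "__main__":
    items = [Item("a", datetime(2021, 5, 1), ["sports"]),
             Item("b", datetime(2021, 5, 3), ["politics"])]
    profile = RecipientProfile(interests={"sports": 0.9})
    print([i.item_id for i in rank_items(items, "chronological")])          # ['b', 'a']
    print([i.item_id for i in rank_items(items, "personalised", profile)])  # ['a', 'b']

The point of the sketch is the design choice: the alternative option relies only on properties of the items themselves, so no data about the individual recipient is needed to produce the ranking.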
  63. Advertising systems used by very large online platforms pose particular risks and require further public and regulatory supervision on account of their scale and ability to target and reach recipients of the service based on their behaviour within and outside that platform’s online interface. Very large online platforms should ensure public access to repositories of advertisements displayed on their online interfaces to facilitate supervision and research into emerging risks brought about by the distribution of advertising online, for example in relation to illegal advertisements or manipulative techniques and disinformation with a real and foreseeable negative impact on public health, public security, civic discourse, political participation and equality. Repositories should include the content of advertisements and related data on the advertiser and the delivery of the advertisement, in particular where targeted advertising is concerned. 
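As a purely illustrative aid, the Python sketch below shows the kind of record such a public advertisement repository could expose; the field names are assumptions made for this example and are not prescribed by the recital.

# Illustrative sketch only: a hypothetical record format for a public advertisement
# repository in the spirit of recital 63. Field names are invented for this example.
from dataclasses import dataclass, asdict
from datetime import date
from typing import Dict, Optional
import json

@dataclass
class AdRepositoryEntry:
    ad_id: str
    content: str                      # the advertisement as displayed (or a reference to it)
    advertiser: str                   # the natural or legal person on whose behalf it was shown
    first_shown: date
    last_shown: date
    total_recipients_reached: int
    targeting_parameters: Optional[Dict[str, str]] = None  # only where targeting was used

    def to_json(self) -> str:
        record = asdict(self)
        record["first_shown"] = self.first_shown.isoformat()
        record["last_shown"] = self.last_shown.isoformat()
        return json.dumps(record, ensure_ascii=False)

if __name__ == "__main__":
    entry = AdRepositoryEntry(
        ad_id="ad-0001",
        content="Example advertisement text",
        advertiser="Example Advertiser Ltd",
        first_shown=date(2021, 3, 1),
        last_shown=date(2021, 3, 31),
        total_recipients_reached=120000,
        targeting_parameters={"age_range": "25-34", "interest": "cycling"},
    )
    print(entry.to_json())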
  64. In order to appropriately supervise the compliance of very large online platforms with the obligations laid down by this Regulation, the Digital Services Coordinator of establishment or the Commission may require access to or reporting of specific data. Such a requirement may include, for example, the data necessary to assess the risks and possible harms brought about by the platform’s systems, data on the accuracy, functioning and testing of algorithmic systems for content moderation, recommender systems or advertising systems, or data on processes and outputs of content moderation or of internal complaint-handling systems within the meaning of this Regulation. Investigations by researchers on the evolution and severity of online systemic risks are particularly important for bridging information asymmetries and establishing a resilient system of risk mitigation, informing online platforms, Digital Services Coordinators, other competent authorities, the Commission and the public. This Regulation therefore provides a framework for compelling access to data from very large online platforms to vetted researchers. All requirements for access to data under that framework should be proportionate and appropriately protect the rights and legitimate interests, including trade secrets and other confidential information, of the platform and any other parties concerned, including the recipients of the service. 
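The following Python sketch is one hypothetical way to model a researcher data access request that reflects the proportionality and confidentiality considerations of this recital; the fields and the rough scoping check are assumptions, not a format defined by the proposal.

# Illustrative sketch only: a hypothetical structure for a data access request by a
# vetted researcher, reflecting the proportionality and confidentiality considerations
# mentioned in recital 64. Nothing here is a format defined by the proposal.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataAccessRequest:
    requester: str                     # vetted researcher or research organisation
    platform: str                      # very large online platform addressed
    research_purpose: str              # systemic risk the research investigates
    data_requested: List[str]          # specific data categories, kept to what is needed
    confidentiality_safeguards: List[str] = field(default_factory=list)

    def is_scoped(self, max_categories: int = 5) -> bool:
        """Very rough proportionality check: the request names specific, limited data."""
        return 0 < len(self.data_requested) <= max_categories

if __name__ == "__main__":
    request = DataAccessRequest(
        requester="Example University research group",
        platform="Example Platform",
        research_purpose="Evolution and severity of coordinated disinformation campaigns",
        data_requested=["aggregated recommender exposure statistics",
                        "public ad repository extracts"],
        confidentiality_safeguards=["secure processing environment",
                                    "no re-identification of recipients"],
    )
    print(request.is_scoped())  # True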
  65. Given the complexity of the functioning of the systems deployed and the systemic risks they present to society, very large online platforms should appoint compliance officers, who should have the necessary qualifications to operationalise measures and monitor the compliance with this Regulation within the platform’s organisation. Very large online platforms should ensure that the compliance officer is involved, properly and in a timely manner, in all issues which relate to this Regulation. In view of the additional risks relating to their activities and their additional obligations under this Regulation, the other transparency requirements set out in this Regulation should be complemented by additional transparency requirements applicable specifically to very large online platforms, notably to report on the risk assessments performed and subsequent measures adopted as provided by this Regulation. 
  66. To facilitate the effective and consistent application of the obligations in this Regulation that may require implementation through technological means, it is important to promote voluntary industry standards covering certain technical procedures, where the industry can help develop standardised means to comply with this Regulation, such as allowing the submission of notices, including through application programming interfaces, or standards concerning the interoperability of advertisement repositories. Such standards could in particular be useful for relatively small providers of intermediary services. The standards could distinguish between different types of illegal content or different types of intermediary services, as appropriate. 
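By way of illustration only, a standardised, machine-readable notice submitted through an application programming interface, as this recital envisages industry standards could enable, might be shaped along the following lines; the endpoint, field names and required elements are hypothetical.

# Illustrative sketch only: a hypothetical, standardised payload for submitting a notice
# of allegedly illegal content through an application programming interface, as recital 66
# envisages industry standards could enable. The endpoint and field names are invented.
import json
from urllib import request

NOTICE_ENDPOINT = "https://platform.example/api/v1/notices"  # hypothetical endpoint

def build_notice(content_url: str, explanation: str, notifier_name: str,
                 notifier_email: str) -> dict:
    """Assemble a minimal machine-readable notice."""
    return {
        "content_url": content_url,            # exact location of the information
        "explanation": explanation,            # why the notifier considers it illegal
        "notifier": {"name": notifier_name, "email": notifier_email},
        "good_faith_statement": True,          # confirmation the notice is made in good faith
    }

def prepare_submission(notice: dict) -> request.Request:
    """Prepare an HTTP request carrying the notice as JSON (not actually sent in this sketch)."""
    data = json.dumps(notice).encode("utf-8")
    return request.Request(NOTICE_ENDPOINT, data=data,
                           headers={"Content-Type": "application/json"}, method="POST")

if __name__ == "__main__":
    notice = build_notice(
        content_url="https://platform.example/posts/123",
        explanation="Offers a product prohibited under national law.",
        notifier_name="Jane Doe",
        notifier_email="jane.doe@example.org",
    )
    print(json.dumps(notice, indent=2))

A shared payload shape of this kind is what would let smaller providers and notifiers reuse one implementation across services, which is the practical benefit the recital points to.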
  67. The Commission and the Board should encourage the drawing-up of codes of conduct to contribute to the application of this Regulation. While the implementation of codes of conduct should be measurable and subject to public oversight, this should not impair the voluntary nature of such codes and the freedom of interested parties to decide whether to participate. In certain circumstances, it is important that very large online platforms cooperate in the drawing-up of, and adhere to, specific codes of conduct. Nothing in this Regulation prevents other service providers from adhering to the same standards of due diligence, adopting best practices and benefitting from the guidance provided by the Commission and the Board, by participating in the same codes of conduct.  
  68. It is appropriate that this Regulation identify certain areas of consideration for such codes of conduct. In particular, risk mitigation measures concerning specific types of illegal content should be explored via self- and co-regulatory agreements. Another area for consideration is the possible negative impacts of systemic risks on society and democracy, such as disinformation or manipulative and abusive activities. This includes coordinated operations aimed at amplifying information, including disinformation, such as the use of bots or fake accounts for the creation of fake or misleading information, sometimes with a purpose of obtaining economic gain, which are particularly harmful for vulnerable recipients of the service, such as children. In relation to such areas, adherence to and compliance with a given code of conduct by a very large online platform may be considered as an appropriate risk mitigating measure. The refusal without proper explanations by an online platform of the Commission’s invitation to participate in the application of such a code of conduct could be taken into account, where relevant, when determining whether the online platform has infringed the obligations laid down by this Regulation. 
  69. The rules on codes of conduct under this Regulation could serve as a basis for already established self-regulatory efforts at Union level, including the Product Safety Pledge, the Memorandum of Understanding against counterfeit goods, the Code of Conduct against illegal hate speech as well as the Code of practice on disinformation. In particular for the latter, the Commission will issue guidance for strengthening the Code of practice on disinformation as announced in the European Democracy Action Plan. 
  70. The provision of online advertising generally involves several actors, including intermediary services that connect publishers of advertising with advertisers. Codes of conduct should support and complement the transparency obligations relating to advertisement for online platforms and very large online platforms set out in this Regulation in order to provide for flexible and effective mechanisms to facilitate and enhance the compliance with those obligations, notably as concerns the modalities of the transmission of the relevant information. The involvement of a wide range of stakeholders should ensure that those codes of conduct are widely supported, technically sound, effective and offer the highest levels of user-friendliness to ensure that the transparency obligations achieve their objectives. 
  71. In case of extraordinary circumstances affecting public security or public health, the Commission may initiate the drawing up of crisis protocols to coordinate a rapid, collective and cross-border response in the online environment. Extraordinary circumstances may entail any unforeseeable event, such as earthquakes, hurricanes, pandemics and other serious cross-border threats to public health, war and acts of terrorism, where, for example, online platforms may be misused for the rapid spread of illegal content or disinformation or where the need arises for rapid dissemination of reliable information. In light of the important role of very large online platforms in disseminating information in our societies and across borders, such platforms should be encouraged to draw up and apply specific crisis protocols. Such crisis protocols should be activated only for a limited period of time and the measures adopted should also be limited to what is strictly necessary to address the extraordinary circumstance. Those measures should be consistent with this Regulation, and should not amount to a general obligation for the participating very large online platforms to monitor the information which they transmit or store, nor actively to seek facts or circumstances indicating illegal content. 
  72. The task of ensuring adequate oversight and enforcement of the obligations laid down in this Regulation should in principle be attributed to the Member States. To this end, they should appoint at least one authority with the task of applying and enforcing this Regulation. Member States should however be able to entrust more than one competent authority with specific supervisory or enforcement tasks and competences concerning the application of this Regulation, for example for specific sectors, such as electronic communications regulators, media regulators or consumer protection authorities, reflecting their domestic constitutional, organisational and administrative structure. 
  73. Given the cross-border nature of the services at stake and the horizontal range of obligations introduced by this Regulation, the authority appointed with the task of supervising the application and, where necessary, enforcing this Regulation should be identified as a Digital Services Coordinator in each Member State. Where more than one competent authority is appointed to apply and enforce this Regulation, only one authority in that Member State should be identified as a Digital Services Coordinator. The Digital Services Coordinator should act as the single contact point with regard to all matters related to the application of this Regulation for the Commission, the Board, the Digital Services Coordinators of other Member States, as well as for other competent authorities of the Member State in question. In particular, where several competent authorities are entrusted with tasks under this Regulation in a given Member State, the Digital Services Coordinator should coordinate and cooperate with those authorities in accordance with the national law setting their respective tasks, and should ensure effective involvement of all relevant authorities in the supervision and enforcement at Union level. 
  74. The Digital Services Coordinator, as well as other competent authorities designated under this Regulation, play a crucial role in ensuring the effectiveness of the rights and obligations laid down in this Regulation and the achievement of its objectives. Accordingly, it is necessary to ensure that those authorities act in complete independence from private and public bodies, without the obligation or possibility to seek or receive instructions, including from the government, and without prejudice to the specific duties to cooperate with other competent authorities, the Digital Services Coordinators, the Board and the Commission. On the other hand, the independence of these authorities should not mean that they cannot be subject, in accordance with national constitutions and without endangering the achievement of the objectives of this Regulation, to national control or monitoring mechanisms regarding their financial expenditure or to judicial review, or that they should not have the possibility to consult other national authorities, including law enforcement authorities or crisis management authorities, where appropriate. 
  75. Member States can designate an existing national authority with the function of the Digital Services Coordinator, or with specific tasks to apply and enforce this Regulation, provided that any such appointed authority complies with the requirements laid down in this Regulation, such as in relation to its independence. Moreover, Member States are in principle not precluded from merging functions within an existing authority, in accordance with Union law. The measures to that effect may include, inter alia, precluding the dismissal of the President or a board member of a collegiate body of an existing authority before the expiry of their terms of office, on the sole ground that an institutional reform has taken place involving the merger of different functions within one authority, in the absence of any rules guaranteeing that such dismissals do not jeopardise the independence and impartiality of such members. 
  76. In the absence of a general requirement for providers of intermediary services to ensure a physical presence within the territory of one of the Member States, there is a need to ensure clarity as to the Member State under whose jurisdiction those providers fall for the purposes of enforcing the rules laid down in Chapters III and IV by the national competent authorities. A provider should be under the jurisdiction of the Member State where its main establishment is located, that is, where the provider has its head office or registered office within which the principal financial functions and operational control are exercised. In respect of providers that do not have an establishment in the Union but that offer services in the Union and therefore fall within the scope of this Regulation, the Member State where those providers appointed their legal representative should have jurisdiction, considering the function of legal representatives under this Regulation. In the interest of the effective application of this Regulation, all Member States should, however, have jurisdiction in respect of providers that failed to designate a legal representative, provided that the principle of ne bis in idem is respected. To that aim, each Member State that exercises jurisdiction in respect of such providers should, without undue delay, inform all other Member States of the measures it has taken in the exercise of that jurisdiction. 
  77. Member States should provide the Digital Services Coordinator, and any other competent authority designated under this Regulation, with sufficient powers and means to ensure effective investigation and enforcement. Digital Services Coordinators should in particular be able to search for and obtain information which is located in their territory, including in the context of joint investigations, with due regard to the fact that oversight and enforcement measures concerning a provider under the jurisdiction of another Member State should be adopted by the Digital Services Coordinator of that other Member State, where relevant in accordance with the procedures relating to cross-border cooperation. 
  78. Member States should set out in their national law, in accordance with Union law and in particular this Regulation and the Charter, the detailed conditions and limits for the exercise of the investigatory and enforcement powers of their Digital Services Coordinators, and other competent authorities where relevant, under this Regulation. 
  79. In the course of the exercise of those powers, the competent authorities should comply with the applicable national rules regarding procedures and matters such as the need for a prior judicial authorisation to enter certain premises and legal professional privilege. Those provisions should in particular ensure respect for the fundamental rights to an effective remedy and to a fair trial, including the rights of defence and the right to respect for private life. In this regard, the guarantees provided for in relation to the proceedings of the Commission pursuant to this Regulation could serve as an appropriate point of reference. A prior, fair and impartial procedure should be guaranteed before taking any final decision, including the right to be heard of the persons concerned, and the right to have access to the file, while respecting confidentiality and professional and business secrecy, as well as the obligation to give meaningful reasons for the decisions. This should not preclude the taking of measures, however, in duly substantiated cases of urgency and subject to appropriate conditions and procedural arrangements. The exercise of powers should also be proportionate to, inter alia, the nature and the overall actual or potential harm caused by the infringement or suspected infringement. The competent authorities should in principle take all relevant facts and circumstances of the case into account, including information gathered by competent authorities in other Member States. 
  80. Member States should ensure that violations of the obligations laid down in this Regulation can be sanctioned in a manner that is effective, proportionate and dissuasive, taking into account the nature, gravity, recurrence and duration of the violation, in view of the public interest pursued, the scope and kind of activities carried out, as well as the economic capacity of the infringer. In particular, penalties should take into account whether the provider of intermediary services concerned systematically or recurrently fails to comply with its obligations stemming from this Regulation, as well as, where relevant, whether the provider is active in several Member States. 
  81. In order to ensure effective enforcement of this Regulation, individuals or representative organisations should be able to lodge any complaint related to compliance with this Regulation with the Digital Services Coordinator in the territory where they received the service, without prejudice to this Regulation’s rules on jurisdiction. Complaints should provide a faithful overview of concerns related to a particular intermediary service provider’s compliance and could also inform the Digital Services Coordinator of any more cross-cutting issues. The Digital Services Coordinator should involve other national competent authorities as well as the Digital Services Coordinator of another Member State, and in particular the one of the Member State where the provider of intermediary services concerned is established, if the issue requires cross-border cooperation. 
  82. Member States should ensure that Digital Services Coordinators can take measures that are effective in addressing and proportionate to certain particularly serious and persistent infringements. Especially where those measures can affect the rights and interests of third parties, as may be the case in particular where the access to online interfaces is restricted, it is appropriate to require that the measures be ordered by a competent judicial authority at the Digital Services Coordinator’s request and are subject to additional safeguards. In particular, third parties potentially affected should be afforded the opportunity to be heard and such orders should only be issued when powers to take such measures as provided by other acts of Union law or by national law, for instance to protect collective interests of consumers, to ensure the prompt removal of web pages containing or disseminating child pornography, or to disable access to services being used by a third party to infringe an intellectual property right, are not reasonably available. 
  83. Such an order to restrict access should not go beyond what is necessary to achieve its objective. For that purpose, it should be temporary and be addressed in principle to a provider of intermediary services, such as the relevant hosting service provider, internet service provider or domain registry or registrar, which is in a reasonable position to achieve that objective without unduly restricting access to lawful information. 
  84. The Digital Services Coordinator should regularly publish a report on the activities carried out under this Regulation. Given that the Digital Services Coordinator is also made aware of orders to take action against illegal content or to provide information regulated by this Regulation through the common information sharing system, the Digital Services Coordinator should include in its annual report the number and categories of these orders addressed to providers of intermediary services issued by judicial and administrative authorities in its Member State. 
  85. Where a Digital Services Coordinator requests another Digital Services Coordinator to take action, the requesting Digital Services Coordinator, or the Board in case it issued a recommendation to assess issues involving more than three Member States, should be able to refer the matter to the Commission in case of any disagreement as to the assessments or the measures taken or proposed or a failure to adopt any measures. The Commission, on the basis of the information made available by the concerned authorities, should accordingly be able to request the competent Digital Services Coordinator to re-assess the matter and take the necessary measures to ensure compliance within a defined time period. This possibility is without prejudice to the Commission’s general duty to oversee the application of, and where necessary enforce, Union law under the control of the Court of Justice of the European Union in accordance with the Treaties. A failure by the Digital Services Coordinator of establishment to take any measures pursuant to such a request may also lead to the Commission’s intervention under Section 3 of Chapter IV of this Regulation, where the suspected infringer is a very large online platform. 
  86. In order to facilitate cross-border supervision and investigations involving several Member States, the Digital Services Coordinators should be able to participate, on a permanent or temporary basis, in joint oversight and investigation activities concerning matters covered by this Regulation. Those activities may include other competent authorities and may cover a variety of issues, ranging from coordinated data gathering exercises to requests for information or inspections of premises, within the limits and scope of powers available to each participating authority. The Board may be requested to provide advice in relation to those activities, for example by proposing roadmaps and timelines for activities or proposing ad-hoc task-forces with participation of the authorities involved. 
  87. In view of the particular challenges that may emerge in relation to assessing and ensuring a very large online platform’s compliance, for instance relating to the scale or complexity of a suspected infringement or the need for particular expertise or capabilities at Union level, Digital Services Coordinators should have the possibility to request, on a voluntary basis, the Commission to intervene and exercise its investigatory and enforcement powers under this Regulation. 
  88. In order to ensure a consistent application of this Regulation, it is necessary to set up an independent advisory group at Union level, which should support the Commission and help coordinate the actions of Digital Services Coordinators. That European Board for Digital Services should consist of the Digital Services Coordinators, without prejudice to the possibility for Digital Services Coordinators to invite to its meetings or appoint ad hoc delegates from other competent authorities entrusted with specific tasks under this Regulation, where that is required pursuant to their national allocation of tasks and competences. In case of multiple participants from one Member State, the voting right should remain limited to one representative per Member State. 
  89. The Board should contribute to achieving a common Union perspective on the consistent application of this Regulation and to cooperation among competent authorities, including by advising the Commission and the Digital Services Coordinators about appropriate investigation and enforcement measures, in particular vis-à-vis very large online platforms. The Board should also contribute to the drafting of relevant templates and codes of conduct and analyse emerging general trends in the development of digital services in the Union.
  90. For that purpose, the Board should be able to adopt opinions, requests and recommendations addressed to Digital Services Coordinators or other competent national authorities. While those opinions, requests and recommendations are not legally binding, any decision to deviate from them should be properly explained and could be taken into account by the Commission in assessing the compliance of the Member State concerned with this Regulation. 
  91. The Board should bring together the representatives of the Digital Services Coordinators and possible other competent authorities under the chairmanship of the Commission, with a view to ensuring an assessment of matters submitted to it in a fully European dimension. In view of possible cross-cutting elements that may be of relevance for other regulatory frameworks at Union level, the Board should be allowed to cooperate with other Union bodies, offices, agencies and advisory groups with responsibilities in fields such as equality, including equality between women and men, and non-discrimination, data protection, electronic communications, audiovisual services, detection and investigation of frauds against the EU budget as regards customs duties, or consumer protection, as necessary for the performance of its tasks. 
  92. The Commission, through the Chair, should participate in the Board without voting rights. Through the Chair, the Commission should ensure that the agenda of the meetings is set in accordance with the requests of the members of the Board as laid down in the rules of procedure and in compliance with the duties of the Board laid down in this Regulation. 
  93. In view of the need to ensure support for the Board’s activities, the Board should be able to rely on the expertise and human resources of the Commission and of the competent national authorities. The specific operational arrangements for the internal functioning of the Board should be further specified in the rules of procedure of the Board. 
  94. Given the importance of very large online platforms, in view of their reach and impact, their failure to comply with the specific obligations applicable to them may affect a substantial number of recipients of the services across different Member States and may cause large societal harms, while such failures may also be particularly complex to identify and address. 
  95. In order to address those public policy concerns it is therefore necessary to provide for a common system of enhanced supervision and enforcement at Union level. Once an infringement of one of the provisions that solely apply to very large online platforms has been identified, for instance pursuant to individual or joint investigations, auditing or complaints, the Digital Services Coordinator of establishment, upon its own initiative or upon the Board’s advice, should monitor any subsequent measure taken by the very large online platform concerned as set out in its action plan. That Digital Services Coordinator should be able to ask, where appropriate, for an additional, specific audit to be carried out, on a voluntary basis, to establish whether those measures are sufficient to address the infringement. At the end of that procedure, it should inform the Board, the Commission and the platform concerned of its views on whether or not that platform addressed the infringement, specifying in particular the relevant conduct and its assessment of any measures taken. The Digital Services Coordinator should perform its role under this common system in a timely manner and taking utmost account of any opinions and other advice of the Board. 
  96. Where the infringement of the provision that solely applies to very large online platforms is not effectively addressed by that platform pursuant to the action plan, only the Commission may, on its own initiative or upon advice of the Board, decide to further investigate the infringement concerned and the measures that the platform has subsequently taken, to the exclusion of the Digital Services Coordinator of establishment. After having conducted the necessary investigations, the Commission should be able to issue decisions finding an infringement and imposing sanctions in respect of very large online platforms where that is justified. It should also have such a possibility to intervene in cross-border situations where the Digital Services Coordinator of establishment did not take any measures despite the Commission’s request, or in situations where the Digital Services Coordinator of establishment itself requested the Commission to intervene, in respect of an infringement of any other provision of this Regulation committed by a very large online platform. 
  97. The Commission should remain free to decide whether or not it wishes to intervene in any of the situations where it is empowered to do so under this Regulation. Once the Commission has initiated proceedings, the Digital Services Coordinators of establishment concerned should be precluded from exercising their investigatory and enforcement powers in respect of the relevant conduct of the very large online platform concerned, so as to avoid duplication, inconsistencies and risks from the viewpoint of the principle of ne bis in idem. However, in the interest of effectiveness, those Digital Services Coordinators should not be precluded from exercising their powers either to assist the Commission, at its request, in the performance of its supervisory tasks, or in respect of other conduct, including conduct by the same very large online platform that is suspected to constitute a new infringement. Those Digital Services Coordinators, as well as the Board and other Digital Services Coordinators where relevant, should provide the Commission with all necessary information and assistance to allow it to perform its tasks effectively, whilst conversely the Commission should keep them informed on the exercise of its powers as appropriate. In that regard, the Commission should, where appropriate, take account of any relevant assessments carried out by the Board or by the Digital Services Coordinators concerned and of any relevant evidence and information gathered by them, without prejudice to the Commission’s powers and responsibility to carry out additional investigations as necessary. 
  98. In view of both the particular challenges that may arise in seeking to ensure compliance by very large online platforms and the importance of doing so effectively, considering their size and impact and the harms that they may cause, the Commission should have strong investigative and enforcement powers to allow it to investigate, enforce and monitor certain of the rules laid down in this Regulation, in full respect of the principle of proportionality and the rights and interests of the affected parties. 
  99. In particular, the Commission should have access to any relevant documents, data and information necessary to open and conduct investigations and to monitor the compliance with the relevant obligations laid down in this Regulation, irrespective of who possesses the documents, data or information in question, and regardless of their form or format, their storage medium, or the precise place where they are stored. The Commission should be able to directly require that the very large online platform concerned or relevant third parties, or individuals, provide any relevant evidence, data and information. In addition, the Commission should be able to request any relevant information from any public authority, body or agency within the Member State, or from any natural person or legal person for the purpose of this Regulation. The Commission should be empowered to require access to, and explanations relating to, databases and algorithms of relevant persons, and to interview, with their consent, any persons who may be in possession of useful information and to record the statements made. The Commission should also be empowered to undertake such inspections as are necessary to enforce the relevant provisions of this Regulation. Those investigatory powers aim to complement the Commission’s possibility to ask Digital Services Coordinators and other Member States’ authorities for assistance, for instance by providing information or in the exercise of those powers. 
  100. Compliance with the relevant obligations imposed under this Regulation should be enforceable by means of fines and periodic penalty payments. To that end, appropriate levels of fines and periodic penalty payments should also be laid down for non-compliance with the obligations and breach of the procedural rules, subject to appropriate limitation periods. 
  101. The very large online platforms concerned and other persons subject to the exercise of the Commission’s powers whose interests may be affected by a decision should be given the opportunity of submitting their observations beforehand, and the decisions taken should be widely publicised. While ensuring the rights of defence of the parties concerned, in particular, the right of access to the file, it is essential that confidential information be protected. Furthermore, while respecting the confidentiality of the information, the Commission should ensure that any information relied on for the purpose of its decision is disclosed to an extent that allows the addressee of the decision to understand the facts and considerations that led to the decision. 
  102. In the interest of effectiveness and efficiency, in addition to the general evaluation of the Regulation, to be performed within five years of entry into force, after the initial start-up phase and on the basis of the first three years of application of this Regulation, the Commission should also perform an evaluation of the activities of the Board and of its structure. 
  103. In order to ensure uniform conditions for the implementation of this Regulation, implementing powers should be conferred on the Commission. Those powers should be exercised in accordance with Regulation (EU) No 182/2011 of the European Parliament and of the Council. 
  104. In order to fulfil the objectives of this Regulation, the power to adopt acts in accordance with Article 290 of the Treaty should be delegated to the Commission to supplement this Regulation. In particular, delegated acts should be adopted in respect of criteria for identification of very large online platforms and of technical specifications for access requests. It is of particular importance that the Commission carries out appropriate consultations and that those consultations be conducted in accordance with the principles laid down in the Interinstitutional Agreement on Better Law-Making of 13 April 2016. In particular, to ensure equal participation in the preparation of delegated acts, the European Parliament and the Council receive all documents at the same time as Member States’ experts, and their experts systematically have access to meetings of Commission expert groups dealing with the preparation of delegated acts. 
  105. This Regulation respects the fundamental rights recognised by the Charter and the fundamental rights constituting general principles of Union law. Accordingly, this Regulation should be interpreted and applied in accordance with those fundamental rights, including the freedom of expression and information, as well as the freedom and pluralism of the media. When exercising the powers set out in this Regulation, all public authorities involved should achieve, in situations where the relevant fundamental rights conflict, a fair balance between the rights concerned, in accordance with the principle of proportionality. 
  106. Since the objective of this Regulation, namely the proper functioning of the internal market and the creation of a safe, predictable and trusted online environment in which the fundamental rights enshrined in the Charter are duly protected, cannot be sufficiently achieved by the Member States because they cannot achieve the necessary harmonisation and cooperation by acting alone, but can rather, by reason of its territorial and personal scope, be better achieved at the Union level, the Union may adopt measures, in accordance with the principle of subsidiarity as set out in Article 5 of the Treaty on European Union. In accordance with the principle of proportionality, as set out in that Article, this Regulation does not go beyond what is necessary in order to achieve that objective, 

HAVE ADOPTED THIS REGULATION: 

Chapter I

General provisions 

Article 1
Subject matter and scope 
  1. This Regulation lays down harmonised rules on the provision of intermediary services in the internal market. In particular, it establishes: 
    1. a framework for the conditional exemption from liability of providers of intermediary services; 
    2. rules on specific due diligence obligations tailored to certain specific categories of providers of intermediary services; 
    3. rules on the implementation and enforcement of this Regulation, including as regards the cooperation of and coordination between the competent authorities. 
  2. The aims of this Regulation are to: 
    1. contribute to the proper functioning of the internal market for intermediary services; 
    2. set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected. 
  3. This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services. 
  4. This Regulation shall not apply to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service. 
  5. This Regulation is without prejudice to the rules laid down by the following: 
    1. Directive 2000/31/EC; 
    2. Directive 2010/13/EU; 
    3. Union law on copyright and related rights;
    4. Regulation (EU) …/…. on preventing the dissemination of terrorist content online [TCO once adopted]; 
    5. Regulation (EU) …./….on European Production and Preservation Orders for electronic evidence in criminal matters and Directive (EU) …./….laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings [e-evidence once adopted] 
    6. Regulation (EU) 2019/1148; 
    7. Regulation (EU) 2019/1150; 
    8. Union law on consumer protection and product safety, including Regulation (EU) 2017/2394; 
    9. Union law on the protection of personal data, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC. 

Article 2
Definitions 

For the purpose of this Regulation, the following definitions shall apply: 

  1. ‘information society services’ means services within the meaning of Article 1(1)(b) of Directive (EU) 2015/1535; 
  2. ‘recipient of the service’ means any natural or legal person who uses the relevant intermediary service; 
  3. ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business or profession; 
  4. ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of the provider of information society services which has a substantial connection to the Union; such a substantial connection is deemed to exist where the provider has an establishment in the Union; in the absence of such an establishment, the assessment of a substantial connection is based on specific factual criteria, such as: 
    1. a significant number of users in one or more Member States; or
    2. the targeting of activities towards one or more Member States. 
  5. ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession; 
  6. ‘intermediary service’ means one of the following services: 
    1. a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network; 
    2. a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, for the sole purpose of making more efficient the information’s onward transmission to other recipients upon their request; 
    3. a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service; 
  7. ‘illegal content’ means any information which, in itself or by its reference to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law; 
  8. ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation; 
  9. ‘dissemination to the public’ means making information available, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties; 
  10. ‘distance contract’ means a contract within the meaning of Article 2(7) of Directive 2011/83/EU; 
  11. ‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications; 
  12. ‘Digital Services Coordinator of establishment’ means the Digital Services Coordinator of the Member State where the provider of an intermediary service is established or its legal representative resides or is established; 
  13. ‘Digital Services Coordinator of destination’ means the Digital Services Coordinator of a Member State where the intermediary service is provided; 
  14. ‘advertisement’ means information designed to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically for promoting that information; 
  15. ‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed; 
  16. ‘content moderation’ means the activities undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account; 
  17. ‘terms and conditions’ means all terms and conditions or specifications, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.

Chapter II

Liability of providers of intermediary services 

Article 3
‘Mere conduit’ 
  1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, the service provider shall not be liable for the information transmitted, on condition that the provider: 
    1. does not initiate the transmission; 
    2. does not select the receiver of the transmission; and 
    3. does not select or modify the information contained in the transmission. 
  2. The acts of transmission and of provision of access referred to in paragraph 1 include the automatic, intermediate and transient storage of the information transmitted in so far as this takes place for the sole purpose of carrying out the transmission in the communication network, and provided that the information is not stored for any period longer than is reasonably necessary for the transmission. 
  3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement. 

Article 4
‘Caching’ 
  1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information’s onward transmission to other recipients of the service upon their request, on condition that: 
    1. the provider does not modify the information; 
    2. the provider complies with conditions on access to the information; 
    3. the provider complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry; 
    4. the provider does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and 
    5. the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement. 
  2. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement.  

Article 5
Hosting 
  1. Where an information society service is provided that consists of the storage of information provided by a recipient of the service, the service provider shall not be liable for the information stored at the request of a recipient of the service, on condition that the provider: 
    1. does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent; or 
    2. upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content. 
  2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider. 
  3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead an average and reasonably well-informed consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control. 
  4. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement. 

Article 6
Voluntary own-initiative investigations and legal compliance 

Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or other activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation. 

Article 7
No general monitoring or active fact-finding obligations 

No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers. 

Article 8
Orders to act against illegal content 
  1. Providers of intermediary services shall, upon the receipt of an order to act against a specific item of illegal content, issued by the relevant national judicial or administrative authorities, on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the action taken and the moment when the action was taken.  
  2. Member States shall ensure that the orders referred to in paragraph 1 meet the following conditions: 
    1. the order contains the following elements: 
      1. a statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law infringed;
      2. one or more exact uniform resource locators and, where necessary, additional information enabling the identification of the illegal content concerned; 
      3. information about redress available to the provider of the service and to the recipient of the service who provided the content; 
    2. the territorial scope of the order, on the basis of the applicable rules of Union and national law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; 
    3. the order is drafted in the language declared by the provider and is sent to the point of contact, appointed by the provider, in accordance with Article 10. 
  3. The Digital Services Coordinator from the Member State of the judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the orders referred to in paragraph 1 to all other Digital Services Coordinators through the system established in accordance with Article 67. 
  4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law. 
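For the purposes of this consultation only, the elements that paragraph 2 of this Article requires an order to contain can be pictured as a simple data structure; the Python sketch below uses invented names, adds a rough completeness check, and has no legal significance.

# Illustrative sketch only: a hypothetical representation of the elements that an order to
# act against illegal content must contain under Article 8(2). Names are invented; this is
# an annotation aid, not a format prescribed by the proposal.
from dataclasses import dataclass, field
from typing import List

@dataclass
class OrderToActAgainstIllegalContent:
    issuing_authority: str
    legal_basis: str                    # the specific provision of Union or national law infringed
    statement_of_reasons: str           # why the information is illegal content
    content_locations: List[str]        # one or more exact uniform resource locators
    redress_information: str            # redress available to the provider and the recipient
    territorial_scope: List[str]        # limited to what is strictly necessary
    language: str                       # language declared by the provider under Article 10
    additional_identification: List[str] = field(default_factory=list)

    def missing_elements(self) -> List[str]:
        """List required elements of paragraph 2 that are absent from the order."""
        missing = []
        if not self.statement_of_reasons or not self.legal_basis:
            missing.append("statement of reasons referring to the provision infringed")
        if not self.content_locations:
            missing.append("exact uniform resource locator(s)")
        if not self.redress_information:
            missing.append("information about available redress")
        return missing

if __name__ == "__main__":
    order = OrderToActAgainstIllegalContent(
        issuing_authority="Example national administrative authority",
        legal_basis="Article X of national law Y",
        statement_of_reasons="The listing offers a counterfeit product.",
        content_locations=["https://platform.example/listings/456"],
        redress_information="Appeal before the competent national court within 30 days.",
        territorial_scope=["the issuing Member State"],
        language="English",
    )
    print(order.missing_elements())  # []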

Article 9
Orders to provide information 
  1. Providers of intermediary services shall, upon receipt of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order, without undue delay, of its receipt and of the effect given to the order. 
  2. Member States shall ensure that orders referred to in paragraph 1 meet the following conditions: 
    1. the order contains the following elements: 
      1. a statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences; 
      2. information about redress available to the provider and to the recipients of the service concerned; 
    2. the order only requires the provider to provide information already collected for the purposes of providing the service and which lies within its control; 
    3. the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider, in accordance with Article 10; 
  3. The Digital Services Coordinator from the Member State of the national judicial or administrative authority issuing the order shall, without undue delay, transmit a copy of the order referred to in paragraph 1 to all Digital Services Coordinators through the system established in accordance with Article 67. 
  4. The conditions and requirements laid down in this article shall be without prejudice to requirements under national criminal procedural law in conformity with Union law. 

Chapter III

Due diligence obligations for a transparent and safe online environment 

SECTION 1
PROVISIONS APPLICABLE TO ALL PROVIDERS OF INTERMEDIARY SERVICES 

Article 10
Points of contact 
  1. Providers of intermediary services shall establish a single point of contact allowing for direct communication, by electronic means, with Member States’ authorities, the Commission and the Board referred to in Article 47 for the application of this Regulation. 
  2. Providers of intermediary services shall make public the information necessary to easily identify and communicate with their single points of contact. 
  3. Providers of intermediary services shall specify, in the information referred to in paragraph 2, the official language or languages of the Union which can be used to communicate with their points of contact and which shall include at least one of the official languages of the Member State in which the provider of intermediary services has its main establishment or where its legal representative resides or is established. 

Article 11
Legal representatives 
  1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person as their legal representative in one of the Member States where the provider offers its services. 
  2. Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and resources to cooperate with the Member States’ authorities, the Commission and the Board and comply with those decisions. 
  3. The designated legal representative can be held liable for non-compliance with obligations under this Regulation, without prejudice to the liability and legal actions that could be initiated against the provider of intermediary services. 
  4. Providers of intermediary services shall notify the name, address, the electronic mail address and telephone number of their legal representative to the Digital Services Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is up to date. 
  5. The designation of a legal representative within the Union pursuant to paragraph 1 shall not amount to an establishment in the Union. 

Article 12
Terms and conditions 
  1. Providers of intermediary services shall include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service, in their terms and conditions. That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review. It shall be set out in clear and unambiguous language and shall be publicly available in an easily accessible format. 
  2. Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. 

Article 13
Transparency reporting obligations for providers of intermediary services 
  1. Providers of intermediary services shall publish, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: 
    1. the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed for taking the action specified in those orders; 
    2. the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average time needed for taking the action; 
    3. the content moderation engaged in at the providers’ own initiative, including the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures; 
    4. the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average time needed for taking those decisions and the number of instances where those decisions were reversed. 
  2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. 
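
By way of illustration only, the reporting items listed in Article 13(1) can be read as a small data structure. The following Python sketch is the editor's own, non-normative rendering; the field names and groupings are assumptions and are not prescribed by the Regulation.

    from dataclasses import dataclass, field
    from typing import Dict

    @dataclass
    class TransparencyReport:
        # Illustrative container for the Article 13(1) items; names are not prescribed by the Regulation.
        period: str                                                                     # reporting period, e.g. "2021"
        orders_by_illegal_content_type: Dict[str, int] = field(default_factory=dict)    # point (a): Articles 8 and 9 orders
        average_order_handling_days: float = 0.0                                        # point (a): average time to act
        notices_by_alleged_content_type: Dict[str, int] = field(default_factory=dict)   # point (b): Article 14 notices
        actions_on_legal_ground: int = 0                                                # point (b): actions based on the law
        actions_on_terms_and_conditions: int = 0                                        # point (b): actions based on the provider's terms
        own_initiative_measures_by_reason: Dict[str, int] = field(default_factory=dict) # point (c)
        complaints_received: int = 0                                                    # point (d): Article 17 complaints
        complaints_reversed: int = 0                                                    # point (d): decisions reversed on complaint
        average_complaint_handling_days: float = 0.0                                    # point (d): average decision time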

SECTION 2
ADDITIONAL PROVISIONS APPLICABLE TO PROVIDERS OF HOSTING SERVICES, INCLUDING ONLINE PLATFORMS 

Article 14
Notice and action mechanisms 
  1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means. 
  2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices, on the basis of which a diligent economic operator can identify the illegality of the content in question. To that end, the providers shall take the necessary measures to enable and facilitate the submission of notices containing all of the following elements: 
    1. an explanation of the reasons why the individual or entity considers the information in question to be illegal content; 
    2. a clear indication of the electronic location of that information, in particular the exact URL or URLs, and, where necessary, additional information enabling the identification of the illegal content; 
    3. the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU; 
    4. a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete. 
  3. Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. 
  4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall promptly send a confirmation of receipt of the notice to that individual or entity. 
  5. The provider shall also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the redress possibilities in respect of that decision. 
  6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1, and take their decisions in respect of the information to which the notices relate, in a timely, diligent and objective manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4. 
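
For readers approaching Article 14 as a specification, the notice elements of paragraph 2 and the "actual knowledge" effect of paragraph 3 can be sketched as follows. This is a hypothetical, non-normative illustration; the field names and the completeness check are the editor's assumptions, not part of the Regulation.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Notice:
        # Illustrative Article 14(2) notice; field names are the editor's own.
        explanation: str                # point (a): why the content is considered illegal
        urls: List[str]                 # point (b): exact electronic location(s)
        name: Optional[str]             # point (c): may be withheld for Directive 2011/93/EU offences
        email: Optional[str]            # point (c)
        good_faith_statement: bool      # point (d): accuracy and completeness confirmation

    def gives_rise_to_actual_knowledge(notice: Notice, child_abuse_material: bool = False) -> bool:
        # Under Article 14(3), a notice containing all the elements of paragraph 2 is considered
        # to give rise to actual knowledge or awareness for the purposes of Article 5.
        identity_ok = child_abuse_material or bool(notice.name and notice.email)
        return bool(notice.explanation and notice.urls and identity_ok and notice.good_faith_statement)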

Article 15
Statement of reasons 
  1. Where a provider of hosting services decides to remove or disable access to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision. 
  2. The statement of reasons referred to in paragraph 1 shall at least contain the following information: 
    1. whether the decision entails either the removal of, or the disabling of access to, the information and, where relevant, the territorial scope of the disabling of access; 
    2. the facts and circumstances relied on in taking the decision, including where relevant whether the decision was taken pursuant to a notice submitted in accordance with Article 14; 
    3. where applicable, information on the use made of automated means in taking the decision, including where the decision was taken in respect of content detected or identified using automated means; 
    4. where the decision concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground; 
    5. where the decision is based on the alleged incompatibility of the information with the terms and conditions of the provider, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground; 
    6. information on the redress possibilities available to the recipient of the service in respect of the decision, in particular through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress. 
  3. The information provided by the providers of hosting services in accordance with this Article shall be clear and easily comprehensible and as precise and specific as reasonably possible under the given circumstances. The information shall, in particular, be such as to reasonably allow the recipient of the service concerned to effectively exercise the redress possibilities referred to in point (f) of paragraph 2. 
  4. Providers of hosting services shall publish the decisions and the statements of reasons referred to in paragraph 1 in a publicly accessible database managed by the Commission. That information shall not contain personal data.

SECTION 3
ADDITIONAL PROVISIONS APPLICABLE TO ONLINE PLATFORMS 

Article 16
Exclusion for micro and small enterprises 

This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC. 

Article 17
Internal complaint-handling system 
  1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions: 
    1. decisions to remove or disable access to the information; 
    2. decisions to suspend or terminate the provision of the service, in whole or in part, to the recipients; 
    3. decisions to suspend or terminate the recipients’ account. 
  2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. 
  3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, diligent and objective manner. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. 
  4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. 
  5. Online platforms shall ensure that the decisions, referred to in paragraph 4, are not solely taken on the basis of automated means. 

Article 18
Out-of-court dispute settlement 
  1. Recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. Online platforms shall engage, in good faith, with the body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. 
    The first subparagraph is without prejudice to the right of the recipient concerned to redress against the decision before a court in accordance with the applicable law. 
  2. The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, at the request of that body, certify the body, where the body has demonstrated that it meets all of the following conditions: 
    1. it is impartial and independent of online platforms and recipients of the service provided by the online platforms; 
    2. it has the necessary expertise in relation to the issues arising in one or more particular areas of illegal content, or in relation to the application and enforcement of terms and conditions of one or more types of online platforms, allowing the body to contribute effectively to the settlement of a dispute; 
    3. the dispute settlement is easily accessible through electronic communication technology; 
    4. it is capable of settling disputes in a swift, efficient and cost-effective manner and in at least one official language of the Union; 
    5. the dispute settlement takes place in accordance with clear and fair rules of procedure. 
    The Digital Services Coordinator shall, where applicable, specify in the certificate the particular issues to which the body’s expertise relates and the official language or languages of the Union in which the body is capable of settling disputes, as referred to in points (b) and (d) of the first subparagraph, respectively. 
  3. If the body decides the dispute in favour of the recipient of the service, the online platform shall reimburse the recipient for any fees and other reasonable expenses that the recipient has paid or is to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, the recipient shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement. 
    The fees charged by the body for the dispute settlement shall be reasonable and shall in any event not exceed the costs thereof. 
    Certified out-of-court dispute settlement bodies shall make the fees, or the mechanisms used to determine the fees, known to the recipient of the services and the online platform concerned before engaging in the dispute settlement. 
  4. Member States may establish out-of-court dispute settlement bodies for the purposes of paragraph 1 or support the activities of some or all out-of-court dispute settlement bodies that they have certified in accordance with paragraph 2. 
    Member States shall ensure that any of their activities undertaken under the first subparagraph do not affect the ability of their Digital Services Coordinators to certify the bodies concerned in accordance with paragraph 2. 
  5. Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 2, including where applicable the specifications referred to in the second subparagraph of that paragraph. The Commission shall publish a list of those bodies, including those specifications, on a dedicated website, and keep it updated. 
  6. This Article is without prejudice to Directive 2013/11/EU and alternative dispute resolution procedures and entities for consumers established under that Directive. 

Article 19
Trusted flaggers 
  1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers through the mechanisms referred to in Article 14, are processed and decided upon with priority and without delay. 
  2. The status of trusted flaggers under this Regulation shall be awarded, upon application by any entities, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated that it meets all of the following conditions: 
    1. it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; 
    2. it represents collective interests and is independent from any online platform; 
    3. it carries out its activities for the purposes of submitting notices in a timely, diligent and objective manner. 
  3. Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of the trusted flagger in accordance with paragraph 2. 
  4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database and keep the database updated. 
  5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents. 
  6. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity’s status as trusted flagger. 
  7. The Commission, after consulting the Board, may issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 5 and 6.

Article 20
Measures and protection against misuse 
  1. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content. 
  2. Online platforms shall suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded. 
  3. Online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the online platform. Those circumstances shall include at least the following: 
    1. the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year; 
    2. the relative proportion thereof in relation to the total number of items of information provided or notices submitted in the past year; 
    3. the gravity of the misuse and its consequences; 
    4. the intention of the recipient, individual, entity or complainant. 
  4. Online platforms shall set out, in a clear and detailed manner, their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including as regards the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension. 
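
The criteria in Article 20(3) can be pictured as inputs to a case-by-case assessment. The sketch below is purely illustrative: the thresholds are hypothetical, and the Regulation deliberately prescribes no fixed formula.

    def meets_misuse_criteria(manifestly_unfounded: int, total_submitted: int,
                              gravity: float, intentional: bool,
                              abs_threshold: int = 20, rel_threshold: float = 0.5) -> bool:
        # Hypothetical weighing of the Article 20(3) criteria; the thresholds are invented for
        # illustration and the Regulation requires a case-by-case assessment, not a formula.
        if total_submitted == 0:
            return False
        proportion = manifestly_unfounded / total_submitted      # point (b): relative proportion
        return (manifestly_unfounded >= abs_threshold            # point (a): absolute numbers
                and proportion >= rel_threshold
                and (gravity > 0 or intentional))                # points (c) and (d)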

Article 21
Notification of suspicions of criminal offences 
  1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available. 
  2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative or inform Europol. 
    For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, to be taking place or to be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located.

Article 22
Traceability of traders 
  1. Where an online platform allows consumers to conclude distance contracts with traders, it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of its services, the online platform has obtained the following information: 
    1. the name, address, telephone number and electronic mail address of the trader; 
    2. a copy of the identification document of the trader or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council; 
    3. the bank account details of the trader, where the trader is a natural person; 
    4. the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council or any relevant act of Union law; 
    5. where the trader is registered in a trade register or similar public register, the trade register in which the trader is registered and its registration number or equivalent means of identification in that register; 
    6. a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law. 
  2. The online platform shall, upon receiving that information, make reasonable efforts to assess whether the information referred to in points (a), (d) and (e) of paragraph 1 is reliable through the use of any freely accessible official online database or online interface made available by a Member State or the Union or through requests to the trader to provide supporting documents from reliable sources. 
  3. Where the online platform obtains indications that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. 
    Where the trader fails to correct or complete that information, the online platform shall suspend the provision of its service to the trader until the request is complied with. 
  4. The online platform shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of its contractual relationship with the trader concerned. It shall subsequently delete the information. 
  5. Without prejudice to paragraph 2, the platform shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States’ competent authorities or the Commission for the performance of their tasks under this Regulation. 
  6. The online platform shall make the information referred to in points (a), (d), (e) and (f) of paragraph 1 available to the recipients of the service, in a clear, easily accessible and comprehensible manner. 
  7. The online platform shall design and organise its online interface in a way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law. 
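
Article 22 effectively describes a "know your business customer" record together with a gating rule. The following sketch is a non-normative illustration under the editor's own naming; it is not an implementation mandated by the Regulation.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TraderRecord:
        # Illustrative record of the Article 22(1) items; field names are the editor's own.
        name: str                             # point (a)
        address: str                          # point (a)
        phone: str                            # point (a)
        email: str                            # point (a)
        id_document_ref: str                  # point (b): identification document or eID reference
        bank_account: Optional[str]           # point (c): only where the trader is a natural person
        economic_operator_contact: str        # point (d)
        trade_register_number: Optional[str]  # point (e): where registered
        self_certification: bool              # point (f): commitment to offer only compliant products

    def may_offer_to_consumers(record: Optional[TraderRecord], information_verified: bool) -> bool:
        # Under Article 22(1) and (3), the trader may only use the service once the information has
        # been obtained and, where found inaccurate or incomplete, corrected; otherwise the platform
        # suspends the provision of its service to that trader.
        return record is not None and record.self_certification and information_verified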

Article 23
Transparency reporting obligations for providers of online platforms 
  1. In addition to the information referred to in Article 13, online platforms shall include in the reports referred to in that Article information on the following: 
    1. the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 18, the outcomes of the dispute settlement and the average time needed for completing the dispute settlement procedures; 
    2. the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints; 
    3. any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied. 
  2. Online platforms shall publish, at least once every six months, information on the average monthly active recipients of the service in each Member State, calculated as an average over the period of the past six months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2). 
  3. Online platforms shall communicate to the Digital Services Coordinator of establishment, upon its request, the information referred to in paragraph 2, updated to the moment of such request. That Digital Services Coordinator may require the online platform to provide additional information as regards the calculation referred to in that paragraph, including explanations and substantiation in respect of the data used. That information shall not include personal data. 
  4. The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to paragraph 1. 

Article 24
Online advertising transparency 

Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear and unambiguous manner and in real time: 

  (a) that the information displayed is an advertisement; 
  (b) the natural or legal person on whose behalf the advertisement is displayed; 
  (c) meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed. 

SECTION 4
ADDITIONAL OBLIGATIONS FOR VERY LARGE ONLINE PLATFORMS TO MANAGE SYSTEMIC RISKS 

Article 25
Very large online platforms 
  1. This Section shall apply to online platforms which provide their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3. 
  2. The Commission shall adopt delegated acts in accordance with Article 69 to adjust the number of average monthly active recipients of the service in the Union referred to in paragraph 1, where the Union’s population increases or decreases by at least 5 % in relation to its population in 2020 or, after adjustment by means of a delegated act, in relation to its population in the year in which the latest delegated act was adopted. In that case, it shall adjust the number so that it corresponds to 10 % of the Union’s population in the year in which it adopts the delegated act, rounded up or down to allow the number to be expressed in millions. 
  3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1. The methodology shall specify, in particular, how to determine the Union’s population and criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features. 
  4. The Digital Services Coordinator of establishment shall verify, at least every six months, whether the number of average monthly active recipients of the service in the Union of online platforms under their jurisdiction is equal to or higher than the number referred to in paragraph 1. On the basis of that verification, it shall adopt a decision designating the online platform as a very large online platform for the purposes of this Regulation, or terminating that designation, and communicate that decision, without undue delay, to the online platform concerned and to the Commission. 
    The Commission shall ensure that the list of designated very large online platforms is published in the Official Journal of the European Union and keep that list updated. The obligations of this Section shall apply, or cease to apply, to the very large online platforms concerned from four months after that publication. 
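
The interplay between the 45 million threshold in Article 25(1) and the adjustment mechanism in Article 25(2) amounts to simple arithmetic: the threshold tracks roughly 10 % of the Union population, rounded so it can be expressed in millions. The sketch below is illustrative only and uses an approximate 2020 population figure that is not part of the Regulation.

    def vlop_threshold(current_population: int, reference_population: int = 447_000_000) -> int:
        # The 45 million figure in Article 25(1) corresponds to roughly 10 % of the Union
        # population (approximately 447 million in 2020; this figure is an editorial
        # approximation, not taken from the Regulation). Under Article 25(2), a population
        # change of at least 5 % triggers a recalibration to 10 % of the new population,
        # rounded to be expressible in millions.
        change = abs(current_population - reference_population) / reference_population
        if change >= 0.05:
            return round(current_population * 0.10 / 1_000_000) * 1_000_000
        return 45_000_000

    # Example: 447 million inhabitants -> threshold remains 45 000 000
    # Example: 480 million inhabitants -> change of about 7.4 %, threshold becomes 48 000 000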

Article 26
Risk assessment 
  1. Very large online platforms shall identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, any significant systemic risks stemming from the functioning and use made of their services in the Union. This risk assessment shall be specific to their services and shall include the following systemic risks:
    1. the dissemination of illegal content through their services; 
    2. any negative effects for the exercise of the fundamental rights to respect for private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child, as enshrined in Articles 7, 11, 21 and 24 of the Charter respectively; 
    3. intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, with an actual or foreseeable negative effect on the protection of public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. 
  2. When conducting risk assessments, very large online platforms shall take into account, in particular, how their content moderation systems, recommender systems and systems for selecting and displaying advertisement influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions. 

Article 27
Mitigation of risks 
  1. Very large online platforms shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable: 
    1. adapting content moderation or recommender systems, their decision-making processes, the features or functioning of their services, or their terms and conditions; 
    2. targeted measures aimed at limiting the display of advertisements in association with the service they provide; 
    3. reinforcing the internal processes or supervision of any of their activities in particular as regards detection of systemic risk;
    4. initiating or adjusting cooperation with trusted flaggers in accordance with Article 19; 
    5. initiating or adjusting cooperation with other online platforms through the codes of conduct and the crisis protocols referred to in Articles 35 and 37 respectively. 
  2. The Board, in cooperation with the Commission, shall publish comprehensive reports, once a year, which shall include the following: 
    1. identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 31 and 33; 
    2. best practices for very large online platforms to mitigate the systemic risks identified. 
  3. The Commission, in cooperation with the Digital Services Coordinators, may issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on fundamental rights enshrined in the Charter of all parties involved. When preparing those guidelines the Commission shall organise public consultations. 

Article 28
Independent audit 
  1. Very large online platforms shall be subject, at their own expense and at least once a year, to audits to assess compliance with the following: 
    1. the obligations set out in Chapter III; 
    2. any commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37. 
  2. Audits pursuant to paragraph 1 shall be performed by organisations which: 
    1. are independent from the very large online platform concerned; 
    2. have proven expertise in the area of risk management, technical competence and capabilities; 
    3. have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards. 
  3. The organisations that perform the audits shall establish an audit report for each audit. The report shall be in writing and include at least the following: 
    1. the name, address and the point of contact of the very large online platform subject to the audit and the period covered; 
    2. the name and address of the organisation performing the audit; 
    3. a description of the specific elements audited, and the methodology applied; 
    4. a description of the main findings drawn from the audit; 
    5. an audit opinion on whether the very large online platform subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, either positive, positive with comments or negative; 
    6. where the audit opinion is not positive, operational recommendations on specific measures to achieve compliance. 
  4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to taking the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified. 

Article 29
Recommender systems 
  1. Very large online platforms that use recommender systems shall set out in their terms and conditions, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters that they may have made available, including at least one option which is not based on profiling, within the meaning of Article 4 (4) of Regulation (EU) 2016/679. 
  2. Where several options are available pursuant to paragraph 1, very large online platforms shall provide an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them. 
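
Article 29 can be read as requiring a user-selectable set of ranking options, at least one of which does not rely on profiling. The sketch below is a hypothetical illustration; the option names and ranking signals are the editor's assumptions, not requirements of the Regulation.

    from typing import Callable, Dict, List

    # Hypothetical ranking options for an Article 29 recommender system: at least one option
    # must not be based on profiling. Option names and ranking signals are the editor's own.
    RankingFn = Callable[[List[dict]], List[dict]]

    RANKING_OPTIONS: Dict[str, RankingFn] = {
        "personalised": lambda items: sorted(items, key=lambda i: i["predicted_relevance"], reverse=True),
        "most recent (no profiling)": lambda items: sorted(items, key=lambda i: i["published_at"], reverse=True),
    }

    def rank_feed(items: List[dict], selected_option: str = "most recent (no profiling)") -> List[dict]:
        # Article 29(2): the recipient can select and change the preferred option at any time.
        return RANKING_OPTIONS[selected_option](items)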

Article 30
Additional online advertising transparency 
  1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available through application programming interfaces a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed. 
  2. The repository shall include at least all of the following information: 
    1. the content of the advertisement; 
    2. the natural or legal person on whose behalf the advertisement is displayed; 
    3. the period during which the advertisement was displayed; 
    4. whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose; 
    5. the total number of recipients of the service reached and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically. 
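
The repository described in Article 30 is, in essence, a queryable record per advertisement with a one-year retention period after the last display. The sketch below is a non-normative illustration; the field names are the editor's own.

    from dataclasses import dataclass
    from datetime import date, timedelta
    from typing import Optional

    @dataclass
    class AdRepositoryEntry:
        # Illustrative Article 30(2) entry; field names are the editor's own. Per Article 30(1),
        # the repository must not contain personal data of the recipients of the service.
        ad_content: str                      # point (a)
        advertiser: str                      # point (b): person on whose behalf the ad is displayed
        first_displayed: date                # point (c): start of the display period
        last_displayed: date                 # point (c): end of the display period
        targeting_parameters: Optional[str]  # point (d): main parameters, where the ad was targeted
        total_recipients_reached: int        # point (e)

    def available_until(entry: AdRepositoryEntry) -> date:
        # Article 30(1): entries remain publicly available through the API until one year
        # after the advertisement was displayed for the last time.
        return entry.last_displayed + timedelta(days=365)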

Article 31
Data access and scrutiny 
  1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request and within a reasonable period, specified in the request, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only use that data for those purposes. 
  2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification and understanding of systemic risks as set out in Article 26(1). 
  3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate. 
  4. In order to be vetted, researchers shall be affiliated with academic institutions, be independent from commercial interests, have proven records of expertise in the fields related to the risks investigated or related research methodologies, and shall commit and be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request. 
  5. The Commission shall, after consulting the Board, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service. 
  6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested for one of the following two reasons: 
    1. it does not have access to the data; 
    2. giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information, in particular trade secrets. 
  7. Requests for amendment pursuant to point (b) of paragraph 6 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request. 
    The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request. 

Article 32
Compliance officers 
  1. Very large online platforms shall appoint one or more compliance officers responsible for monitoring their compliance with this Regulation. 
  2. Very large online platforms shall only designate as compliance officers persons who have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3. Compliance officers may either be staff members of, or fulfil those tasks on the basis of a contract with, the very large online platform concerned. 
  3. Compliance officers shall have the following tasks: 
    1. cooperating with the Digital Services Coordinator of establishment and the Commission for the purpose of this Regulation; 
    2. organising and supervising the very large online platform’s activities relating to the independent audit pursuant to Article 28;
    3. informing and advising the management and employees of the very large online platform about relevant obligations under this Regulation; 
    4. monitoring the very large online platform’s compliance with its obligations under this Regulation. 
  4. Very large online platforms shall take the necessary measures to ensure that the compliance officers can perform their tasks in an independent manner. 
  5. Very large online platforms shall communicate the name and contact details of the compliance officer to the Digital Services Coordinator of establishment and the Commission. 
  6. Very large online platforms shall support the compliance officer in the performance of his or her tasks and provide him or her with the resources necessary to adequately carry out those tasks. The compliance officer shall directly report to the highest management level of the platform. 

Article 33
Transparency reporting obligations for very large online platforms 
  1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months. 
  2. In addition to the reports provided for in Article 13, very large online platforms shall make publicly available and transmit to the Digital Services Coordinator of establishment and the Commission, at least once a year and within 30 days following the adoption of the audit implementation report provided for in Article 28(4): 
    1. a report setting out the results of the risk assessment pursuant to Article 26; 
    2. the related risk mitigation measures identified and implemented pursuant to Article 27; 
    3. the audit report provided for in Article 28(3); 
    4. the audit implementation report provided for in Article 28(4). 
  3. Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, that platform shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the public reports. 

SECTION 5
OTHER PROVISIONS CONCERNING DUE DILIGENCE OBLIGATIONS 

Article 34
Standards 
  1. The Commission shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies at least for the following:
    1. electronic submission of notices under Article 14; 
    2. electronic submission of notices by trusted flaggers under Article 19, including through application programming interfaces; 
    3. specific interfaces, including application programming interfaces, to facilitate compliance with the obligations set out in Articles 30 and 31; 
    4. auditing of very large online platforms pursuant to Article 28; 
    5. interoperability of the advertisement repositories referred to in Article 30(2); 
    6. transmission of data between advertising intermediaries in support of transparency obligations pursuant to points (b) and (c) of Article 24. 
  2. The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question. 

Article 35
Codes of conduct 
  1. The Commission and the Board shall encourage and facilitate the drawing up of codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law, in particular on competition and the protection of personal data. 
  2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may invite the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as civil society organisations and other interested parties, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes. 
  3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their objectives, contain key performance indicators to measure the achievement of those objectives and take due account of the needs and interests of all interested parties, including citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. 
  4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions. 
  5. The Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain.

Article 36
Codes of conduct for online advertising 
  1. The Commission shall encourage and facilitate the drawing up of codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services or organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency in online advertising beyond the requirements of Articles 24 and 30. 
  2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of personal data. The Commission shall aim to ensure that the codes of conduct address at least: 
    1. the transmission of information held by providers of online advertising intermediaries to recipients of the service with regard to requirements set in points (b) and (c) of Article 24; 
    2. the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 30. 
  3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. 

Article 37
Crisis protocols 
  1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health. 
  2. The Commission shall encourage and facilitate very large online platforms and, where appropriate, other online platforms, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures: 
    1. displaying prominent information on the crisis situation provided by Member States’ authorities or at Union level; 
    2. ensuring that the point of contact referred to in Article 10 is responsible for crisis management; 
    3. where applicable, adapting the resources dedicated to compliance with the obligations set out in Articles 14, 17, 19, 20 and 27 to the needs created by the crisis situation. 
  3. The Commission may involve, as appropriate, Member States’ authorities and Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols. 
  4. The Commission shall aim to ensure that the crisis protocols set out clearly all of the following: 
    1. the specific parameters to determine what constitutes the specific extraordinary circumstance the crisis protocol seeks to address and the objectives it pursues; 
    2. the role of each participant and the measures they are to put in place in preparation and once the crisis protocol has been activated; 
    3. a clear procedure for determining when the crisis protocol is to be activated; 
    4. a clear procedure for determining the period during which the measures to be taken once the crisis protocol has been activated are to be taken, which is strictly limited to what is necessary for addressing the specific extraordinary circumstances concerned; 
    5. safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination; 
    6. a process to publicly report on any measures taken, their duration and their outcomes, upon the termination of the crisis situation. 
  5. If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in point (e) of paragraph 4, it may request the participants to revise the crisis protocol, including by taking additional measures. 

Chapter IV

Implementation, cooperation, sanctions and enforcement 

SECTION 1
COMPETENT AUTHORITIES AND NATIONAL DIGITAL SERVICES COORDINATORS 

Article 38
Competent authorities and Digital Services Coordinators 
  1. Member States shall designate one or more competent authorities as responsible for the application and enforcement of this Regulation (‘competent authorities’). 
  2. Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for all matters relating to application and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union. 
    For that purpose, Digital Services Coordinators shall cooperate with each other, other national competent authorities, the Board and the Commission, without prejudice to the possibility for Member States to provide for regular exchanges of views with other authorities where relevant for the performance of the tasks of those other authorities and of the Digital Services Coordinator.
    Where a Member State designates more than one competent authority in addition to the Digital Services Coordinator, it shall ensure that the respective tasks of those authorities and of the Digital Services Coordinator are clearly defined and that they cooperate closely and effectively when performing their tasks. The Member State concerned shall communicate the name of the other competent authorities as well as their respective tasks to the Commission and the Board. 
  3. Member States shall designate the Digital Services Coordinators within two months from the date of entry into force of this Regulation. 
    Member States shall make publicly available, and communicate to the Commission and the Board, the name of their competent authority designated as Digital Services Coordinator and information on how it can be contacted. 
  4. The requirements applicable to Digital Services Coordinators set out in Articles 39, 40 and 41 shall also apply to any other competent authorities that the Member States designate pursuant to paragraph 1. 

Article 39
Requirements for Digital Services Coordinators 
  1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have adequate technical, financial and human resources to carry out their tasks. 
  2. When carrying out their tasks and exercising their powers in accordance with this Regulation, the Digital Services Coordinators shall act with complete independence. They shall remain free from any external influence, whether direct or indirect, and shall neither seek nor take instructions from any other public authority or any private party. 
  3. Paragraph 2 is without prejudice to the tasks of Digital Services Coordinators within the system of supervision and enforcement provided for in this Regulation and the cooperation with other competent authorities in accordance with Article 38(2). Paragraph 2 shall not prevent supervision of the authorities concerned in accordance with national constitutional law. 

Article 40
Jurisdiction 
  1. The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapters III and IV of this Regulation. 
  2. A provider of intermediary services which does not have an establishment in the Union but which offers services in the Union shall, for the purposes of Chapters III and IV, be deemed to be under the jurisdiction of the Member State where its legal representative resides or is established. 
  3. Where a provider of intermediary services fails to appoint a legal representative in accordance with Article 11, all Member States shall have jurisdiction for the purposes of Chapters III and IV. Where a Member State decides to exercise jurisdiction under this paragraph, it shall inform all other Member States and ensure that the principle of ne bis in idem is respected.
  4. Paragraphs 1, 2 and 3 are without prejudice to the second subparagraph of Article 50(4) and the second subparagraph of Article 51(2) and the tasks and powers of the Commission under Section 3. 

Article 41
Powers of Digital Services Coordinators 
  1. Where needed for carrying out their tasks, Digital Services Coordinators shall have at least the following powers of investigation, in respect of conduct by providers of intermediary services under the jurisdiction of their Member State: 
    1. the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period; 
    2. the power to carry out on-site inspections of any premises that those providers or those persons use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain copies of information relating to a suspected infringement in any form, irrespective of the storage medium; 
    3. the power to ask any member of staff or representative of those providers or those persons to give explanations in respect of any information relating to a suspected infringement and to record the answers. 
  2. Where needed for carrying out their tasks, Digital Services Coordinators shall have at least the following enforcement powers, in respect of providers of intermediary services under the jurisdiction of their Member State: 
    (a) the power to accept the commitments offered by those providers in relation to their compliance with this Regulation and to make those commitments binding; 
    (b) the power to order the cessation of infringements and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end; 
    (c) the power to impose fines in accordance with Article 42 for failure to comply with this Regulation, including with any of the orders issued pursuant to paragraph 1; 
    (d) the power to impose a periodic penalty payment in accordance with Article 42 to ensure that an infringement is terminated in compliance with an order issued pursuant to point (b) of this paragraph or for failure to comply with any of the orders issued pursuant to paragraph 1; 
    (e) the power to adopt interim measures to avoid the risk of serious harm. 
  • As regards points (c) and (d) of the first subparagraph, Digital Services Coordinators shall also have the enforcement powers set out in those points in respect of the other persons referred to in paragraph 1 for failure to comply with any of the orders issued to them pursuant to that paragraph. They shall only exercise those enforcement powers after having provided those other persons in good time with all relevant information relating to such orders, including the applicable time period, the fines or periodic payments that may be imposed for failure to comply and redress possibilities. 
  3. Where needed for carrying out their tasks, Digital Services Coordinators shall also have, in respect of providers of intermediary services under the jurisdiction of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the power to take the following measures: 
    (a) require the management body of the providers, within a reasonable time period, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken; 
    (b) where the Digital Services Coordinator considers that the provider has not sufficiently complied with the requirements of the first indent, that the infringement persists and causes serious harm, and that the infringement entails a serious criminal offence involving a threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access of recipients of the service concerned by the infringement or, only where that is not technically feasible, to the online interface of the provider of intermediary services on which the infringement takes place. 
  • The Digital Services Coordinator shall, except where it acts upon the Commission’s request referred to in Article 65, prior to submitting the request referred to in point (b) of the first subparagraph, invite interested parties to submit written observations within a time period that shall not be less than two weeks, describing the measures that it intends to request and identifying the intended addressee or addressees thereof. The provider, the intended addressee or addressees and any other third party demonstrating a legitimate interest shall be entitled to participate in the proceedings before the competent judicial authority. Any measure ordered shall be proportionate to the nature, gravity, recurrence and duration of the infringement, without unduly restricting access to lawful information by recipients of the service concerned. 
  • The restriction shall be for a period of four weeks, subject to the possibility for the competent judicial authority, in its order, to allow the Digital Services Coordinator to extend that period for further periods of the same lengths, subject to a maximum number of extensions set by that judicial authority. The Digital Services Coordinator shall only extend the period where it considers, having regard to the rights and interests of all parties affected by the restriction and all relevant circumstances, including any information that the provider, the addressee or addressees and any other third party that demonstrated a legitimate interest may provide to it, that both of the following conditions have been met: 
    (a) the provider has failed to take the necessary measures to terminate the infringement; 
    (b) the temporary restriction does not unduly restrict access to lawful information by recipients of the service, having regard to the number of recipients affected and whether any adequate and readily accessible alternatives exist. 
  • Where the Digital Services Coordinator considers that those two conditions have been met but it cannot further extend the period pursuant to the third subparagraph, it shall submit a new request to the competent judicial authority, as referred to in point (b) of the first subparagraph. 
  4. The powers listed in paragraphs 1, 2 and 3 are without prejudice to Section 3. 
  5. The measures taken by the Digital Services Coordinators in the exercise of their powers listed in paragraphs 1, 2 and 3 shall be effective, dissuasive and proportionate, having regard, in particular, to the nature, gravity, recurrence and duration of the infringement or suspected infringement to which those measures relate, as well as the economic, technical and operational capacity of the provider of the intermediary services concerned where relevant. 
  6. Member States shall ensure that any exercise of the powers pursuant to paragraphs 1, 2 and 3 is subject to adequate safeguards laid down in the applicable national law in conformity with the Charter and with the general principles of Union law. In particular, those measures shall only be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all affected parties. 

Article 42
Penalties 
  1. Member States shall lay down the rules on penalties applicable to infringements of this Regulation by providers of intermediary services under their jurisdiction and shall take all the necessary measures to ensure that they are implemented in accordance with Article 41. 
  2. Penalties shall be effective, proportionate and dissuasive. Member States shall notify the Commission of those rules and of those measures and shall notify it, without delay, of any subsequent amendments affecting them. 
  3. Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and to submit to an on-site inspection shall not exceed 1 % of the annual income or turnover of the provider concerned. 
  4. Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned. 

Article 43
Right to lodge a complaint 

Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority. 

Article 44
Activity reports 
  1. Digital Services Coordinators shall draw up an annual report on their activities under this Regulation. They shall make the annual reports available to the public, and shall communicate them to the Commission and to the Board. 
  2. The annual report shall include at least the following information: 
    (a) the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned; 
    (b) the effects given to those orders, as communicated to the Digital Services Coordinator pursuant to Articles 8 and 9. 
  3. Where a Member State has designated several competent authorities pursuant to Article 38, it shall ensure that the Digital Services Coordinator draws up a single report covering the activities of all competent authorities and that the Digital Services Coordinator receives all relevant information and support needed to that effect from the other competent authorities concerned. 

Article 45
Cross-border cooperation among Digital Services Coordinators 
  1. Where a Digital Services Coordinator has reasons to suspect that a provider of an intermediary service, not under the jurisdiction of the Member State concerned, infringed this Regulation, it shall request the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation. 
  • Where the Board has reasons to suspect that a provider of intermediary services infringed this Regulation in a manner involving at least three Member States, it may recommend the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation. 
  2. A request or recommendation pursuant to paragraph 1 shall at least indicate: 
    (a) the point of contact of the provider of the intermediary services concerned as provided for in Article 10; 
    (b) a description of the relevant facts, the provisions of this Regulation concerned and the reasons why the Digital Services Coordinator that sent the request, or the Board, suspects that the provider infringed this Regulation; 
    (c) any other information that the Digital Services Coordinator that sent the request, or the Board, considers relevant, including, where appropriate, information gathered on its own initiative or suggestions for specific investigatory or enforcement measures to be taken, including interim measures. 
  3. The Digital Services Coordinator of establishment shall take into utmost account the request or recommendation pursuant to paragraph 1. Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided. 
  4. The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation. 
  5. Where the Digital Services Coordinator that sent the request, or, where appropriate, the Board, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the Digital Services Coordinator of establishment, it may refer the matter to the Commission, providing all relevant information. That information shall include at least the request or recommendation sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4. 
  6. The Commission shall assess the matter within three months following the referral of the matter pursuant to paragraph 5, after having consulted the Digital Services Coordinator of establishment and, unless it referred the matter itself, the Board. 
  7. Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. 

Article 46
Joint investigations and requests for Commission intervention 
  1. Digital Services Coordinators may participate in joint investigations, which may be coordinated with the support of the Board, with regard to matters covered by this Regulation, concerning providers of intermediary services operating in several Member States. 
  • Such joint investigations are without prejudice to the tasks and powers of the participating Digital Services Coordinators and the requirements applicable to the performance of those tasks and exercise of those powers provided in this Regulation. The participating Digital Services Coordinators shall make the results of the joint investigations available to other Digital Services Coordinators, the Commission and the Board through the system provided for in Article 67 for the fulfilment of their respective tasks under this Regulation. 
  2. Where a Digital Services Coordinator of establishment has reasons to suspect that a very large online platform infringed this Regulation, it may request the Commission to take the necessary investigatory and enforcement measures to ensure compliance with this Regulation in accordance with Section 3. Such a request shall contain all information listed in Article 45(2) and set out the reasons for requesting the Commission to intervene. 

SECTION 2
EUROPEAN BOARD FOR DIGITAL SERVICES 

Article 47
European Board for Digital Services 
  1. An independent advisory group of Digital Services Coordinators on the supervision of providers of intermediary services named ‘European Board for Digital Services’ (the ‘Board’) is established. 
  2. The Board shall advise the Digital Services Coordinators and the Commission in accordance with this Regulation to achieve the following objectives: 
    (a) contributing to the consistent application of this Regulation and effective cooperation of the Digital Services Coordinators and the Commission with regard to matters covered by this Regulation; 
    (b) coordinating and contributing to guidance and analysis of the Commission and Digital Services Coordinators and other competent authorities on emerging issues across the internal market with regard to matters covered by this Regulation; 
    (c) assisting the Digital Services Coordinators and the Commission in the supervision of very large online platforms. 

Article 48
Structure of the Board 
  1. The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator shall participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them. 
  2. Each Member State shall have one vote. The Commission shall not have voting rights. 

    The Board shall adopt its acts by simple majority. 
  3. The Board shall be chaired by the Commission. The Commission shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure. 
  4. The Commission shall provide administrative and analytical support for the activities of the Board pursuant to this Regulation. 
  5. The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available. 
  6. The Board shall adopt its rules of procedure, following the consent of the Commission. 

Article 49
Tasks of the Board 
  1. Where necessary to meet the objectives set out in Article 47(2), the Board shall in particular: 
    (a) support the coordination of joint investigations; 
    (b) support the competent authorities in the analysis of reports and results of audits of very large online platforms to be transmitted pursuant to this Regulation; 
    (c) issue opinions, recommendations or advice to Digital Services Coordinators in accordance with this Regulation; 
    (d) advise the Commission to take the measures referred to in Article 51 and, where requested by the Commission, adopt opinions on draft Commission measures concerning very large online platforms in accordance with this Regulation; 
    (e) support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct as provided for in this Regulation, as well as the identification of emerging issues, with regard to matters covered by this Regulation. 
  2. Digital Services Coordinators and other national competent authorities that do not follow the opinions, requests or recommendations addressed to them adopted by the Board shall provide the reasons for this choice when reporting pursuant to this Regulation or when adopting their relevant decisions, as appropriate. 

SECTION 3
SUPERVISION, INVESTIGATION, ENFORCEMENT AND MONITORING IN RESPECT OF VERY LARGE ONLINE PLATFORMS 

Article 50
Enhanced supervision for very large online platforms 
  1. Where the Digital Services Coordinator of establishment adopts a decision finding that a very large online platform has infringed any of the provisions of Section 4 of Chapter III, it shall make use of the enhanced supervision system laid down in this Article. It shall take utmost account of any opinion and recommendation of the Commission and the Board pursuant to this Article. 
  • The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of those provisions, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period. 
  2. When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan, specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35.
  3. Within one month following receipt of the action plan, the Board shall communicate its opinion on the action plan to the Digital Services Coordinator of establishment. Within one month following receipt of that opinion, that Digital Services Coordinator shall decide whether the action plan is appropriate to terminate or remedy the infringement. 
  • Where the Digital Services Coordinator of establishment has concerns on the ability of the measures to terminate or remedy the infringement, it may request the very large online platform concerned to subject itself to an additional, independent audit to assess the effectiveness of those measures in terminating or remedying the infringement. In that case, that platform shall send the audit report to that Digital Services Coordinator, the Commission and the Board within four months from the decision referred to in the first subparagraph. When requesting such an additional audit, the Digital Services Coordinator may specify a particular audit organisation that is to carry out the audit, at the expense of the platform concerned, selected on the basis of criteria set out in Article 28(2). 
  4. The Digital Services Coordinator of establishment shall communicate to the Commission, the Board and the very large online platform concerned its views as to whether the very large online platform has terminated or remedied the infringement and the reasons thereof. It shall do so within the following time periods, as applicable: 
    (a) within one month from the receipt of the audit report referred to in the second subparagraph of paragraph 3, where such an audit was performed; 
    (b) within three months from the decision on the action plan referred to in the first subparagraph of paragraph 3, where no such audit was performed; 
    (c) immediately upon the expiry of the time period set out in paragraph 2, where that platform failed to communicate the action plan within that time period. 
  • Pursuant to that communication, the Digital Services Coordinator of establishment shall no longer be entitled to take any investigatory or enforcement measures in respect of the relevant conduct by the very large online platform concerned, without prejudice to Article 66 or any other measures that it may take at the request of the Commission. 

Article 51
Intervention by the Commission and opening of proceedings 
  1. The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that: 
    (a) is suspected of having infringed any of the provisions of this Regulation and the Digital Services Coordinator of establishment did not take any investigatory or enforcement measures, pursuant to the request of the Commission referred to in Article 45(7), upon the expiry of the time period set in that request; 
    (b) is suspected of having infringed any of the provisions of this Regulation and the Digital Services Coordinator of establishment requested the Commission to intervene in accordance with Article 46(2), upon the reception of that request;
    (c) has been found to have infringed any of the provisions of Section 4 of Chapter III, upon the expiry of the relevant time period for the communication referred to in Article 50(4). 
  2. Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned. 
  • As regards points (a) and (b) of paragraph 1, pursuant to that notification, the Digital Services Coordinator of establishment concerned shall no longer be entitled to take any investigatory or enforcement measures in respect of the relevant conduct by the very large online platform concerned, without prejudice to Article 66 or any other measures that it may take at the request of the Commission. 
  3. The Digital Services Coordinator referred to in Articles 45(7), 46(2) and 50(1), as applicable, shall, without undue delay upon being informed, transmit to the Commission: 
    (a) any information that that Digital Services Coordinator exchanged relating to the infringement or the suspected infringement, as applicable, with the Board and with the very large online platform concerned; 
    (b) the case file of that Digital Services Coordinator relating to the infringement or the suspected infringement, as applicable; 
    (c) any other information in the possession of that Digital Services Coordinator that may be relevant to the proceedings initiated by the Commission. 
  4. The Board, and the Digital Services Coordinators making the request referred to in Article 45(1), shall, without undue delay upon being informed, transmit to the Commission any information in their possession that may be relevant to the proceedings initiated by the Commission. 

Article 52
Requests for information 
  1. In order to carry out the tasks assigned to it under this Section, the Commission may by simple request or by decision require the very large online platforms concerned, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period. 
  2. When sending a simple request for information to the very large online platform concerned or other person referred to in Article 52(1), the Commission shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which the information is to be provided, and the penalties provided for in Article 59 for supplying incorrect or misleading information. 
  3. Where the Commission requires the very large online platform concerned or other person referred to in Article 52(1) to supply information by decision, it shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which it is to be provided. It shall also indicate the penalties provided for in Article 59 and indicate or impose the periodic penalty payments provided for in Article 60. It shall further indicate the right to have the decision reviewed by the Court of Justice of the European Union. 
  4. The owners of the very large online platform concerned or other person referred to in Article 52(1) or their representatives and, in the case of legal persons, companies or firms, or where they have no legal personality, the persons authorised to represent them by law or by their constitution shall supply the information requested on behalf of the very large online platform concerned or other person referred to in Article 52(1). Lawyers duly authorised to act may supply the information on behalf of their clients. The latter shall remain fully responsible if the information supplied is incomplete, incorrect or misleading. 
  5. At the request of the Commission, the Digital Services Coordinators and other competent authorities shall provide the Commission with all necessary information to carry out the tasks assigned to it under this Section. 

Article 53
Power to take interviews and statements 

In order to carry out the tasks assigned to it under this Section, the Commission may interview any natural or legal person which consents to being interviewed for the purpose of collecting information, relating to the subject-matter of an investigation, in relation to the suspected infringement or infringement, as applicable. 

Article 54
Power to conduct on-site inspections 
  1. In order to carry out the tasks assigned to it under this Section, the Commission may conduct on-site inspections at the premises of the very large online platform concerned or other person referred to in Article 52(1). 
  2. On-site inspections may also be carried out with the assistance of auditors or experts appointed by the Commission pursuant to Article 57(2). 
  3. During on-site inspections the Commission and auditors or experts appointed by it may require the very large online platform concerned or other person referred to in Article 52(1) to provide explanations on its organisation, functioning, IT system, algorithms, data-handling and business conduct. The Commission and auditors or experts appointed by it may address questions to key personnel of the very large online platform concerned or other person referred to in Article 52(1). 
  4. The very large online platform concerned or other person referred to in Article 52(1) is required to submit to an on-site inspection ordered by decision of the Commission. The decision shall specify the subject matter and purpose of the visit, set the date on which it is to begin and indicate the penalties provided for in Articles 59 and 60 and the right to have the decision reviewed by the Court of Justice of the European Union. 

Article 55
Interim measures 
  1. In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order interim measures against the very large online platform concerned on the basis of a prima facie finding of an infringement. 
  2. A decision under paragraph 1 shall apply for a specified period of time and may be renewed in so far as this is necessary and appropriate. 

Article 56
Commitments 
  1. If, during proceedings under this Section, the very large online platform concerned offers commitments to ensure compliance with the relevant provisions of this Regulation, the Commission may by decision make those commitments binding on the very large online platform concerned and declare that there are no further grounds for action. 
  2. The Commission may, upon request or on its own initiative, reopen the proceedings: 
    (a) where there has been a material change in any of the facts on which the decision was based; 
    (b) where the very large online platform concerned acts contrary to its commitments; or 
    (c) where the decision was based on incomplete, incorrect or misleading information provided by the very large online platform concerned or other person referred to in Article 52(1). 
  3. Where the Commission considers that the commitments offered by the very large online platform concerned are unable to ensure effective compliance with the relevant provisions of this Regulation, it shall reject those commitments in a reasoned decision when concluding the proceedings. 

Article 57
Monitoring actions 
  1. For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms. 
  2. The actions pursuant to paragraph 1 may include the appointment of independent external experts and auditors to assist the Commission in monitoring compliance with the relevant provisions of this Regulation and to provide specific expertise or knowledge to the Commission. 

Article 58
Non-compliance 
  1. The Commission shall adopt a non-compliance decision where it finds that the very large online platform concerned does not comply with one or more of the following: 
    (a) the relevant provisions of this Regulation; 
    (b) interim measures ordered pursuant to Article 55; 
    (c) commitments made binding pursuant to Article 56. 
  2. Before adopting the decision pursuant to paragraph 1, the Commission shall communicate its preliminary findings to the very large online platform concerned. In the preliminary findings, the Commission shall explain the measures that it considers taking, or that it considers that the very large online platform concerned should take, in order to effectively address the preliminary findings. 
  3. In the decision adopted pursuant to paragraph 1 the Commission shall order the very large online platform concerned to take the necessary measures to ensure compliance with the decision pursuant to paragraph 1 within a reasonable time period and to provide information on the measures that that platform intends to take to comply with the decision. 
  4. The very large online platform concerned shall provide the Commission with a description of the measures it has taken to ensure compliance with the decision pursuant to paragraph 1 upon their implementation. 
  5. Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision. 

Article 59
Fines 
  1. In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6% of its total turnover in the preceding financial year where it finds that that platform, intentionally or negligently: 
    (a) infringes the relevant provisions of this Regulation; 
    (b) fails to comply with a decision ordering interim measures under Article 55; or 
    (c) fails to comply with a voluntary measure made binding by a decision pursuant to Article 56. 
  2. The Commission may by decision impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1% of the total turnover in the preceding financial year, where they intentionally or negligently: 
    (a) supply incorrect, incomplete or misleading information in response to a request pursuant to Article 52 or, when the information is requested by decision, fail to reply to the request within the set time period; 
    (b) fail to rectify, within the time period set by the Commission, incorrect, incomplete or misleading information given by a member of staff, or fail or refuse to provide complete information; 
    (c) refuse to submit to an on-site inspection pursuant to Article 54. 
  3. Before adopting the decision pursuant to paragraph 2, the Commission shall communicate its preliminary findings to the very large online platform concerned or other person referred to in Article 52(1). 
  4. In fixing the amount of the fine, the Commission shall have regard to the nature, gravity, duration and recurrence of the infringement and, for fines imposed pursuant to paragraph 2, the delay caused to the proceedings.

Article 60
Periodic penalty payments 
  1. The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to: 
    (a) supply correct and complete information in response to a decision requiring information pursuant to Article 52; 
    (b) submit to an on-site inspection which it has ordered by decision pursuant to Article 54; 
    (c) comply with a decision ordering interim measures pursuant to Article 55(1); 
    (d) comply with commitments made legally binding by a decision pursuant to Article 56(1); 
    (e) comply with a decision pursuant to Article 58(1). 
  2. Where the very large online platform concerned or other person referred to in Article 52(1) has satisfied the obligation which the periodic penalty payment was intended to enforce, the Commission may fix the definitive amount of the periodic penalty payment at a figure lower than that which would arise under the original decision. 

Article 61
Limitation period for the imposition of penalties 
  1. The powers conferred on the Commission by Articles 59 and 60 shall be subject to a limitation period of five years. 
  2. Time shall begin to run on the day on which the infringement is committed. However, in the case of continuing or repeated infringements, time shall begin to run on the day on which the infringement ceases. 
  3. Any action taken by the Commission or by the Digital Services Coordinator for the purpose of the investigation or proceedings in respect of an infringement shall interrupt the limitation period for the imposition of fines or periodic penalty payments. Actions which interrupt the limitation period shall include, in particular, the following: 
    (a) requests for information by the Commission or by a Digital Services Coordinator; 
    (b) on-site inspection; 
    (c) the opening of a proceeding by the Commission pursuant to Article 51(2). 
  4. Each interruption shall start time running afresh. However, the limitation period for the imposition of fines or periodic penalty payments shall expire at the latest on the day on which a period equal to twice the limitation period has elapsed without the Commission having imposed a fine or a periodic penalty payment. That period shall be extended by the time during which the limitation period is suspended pursuant to paragraph 5.
  5. The limitation period for the imposition of fines or periodic penalty payments shall be suspended for as long as the decision of the Commission is the subject of proceedings pending before the Court of Justice of the European Union. 

Article 62
Limitation period for the enforcement of penalties 
  1. The power of the Commission to enforce decisions taken pursuant to Articles 59 and 60 shall be subject to a limitation period of five years. 
  2. Time shall begin to run on the day on which the decision becomes final. 
  3. The limitation period for the enforcement of penalties shall be interrupted: 
    (a) by notification of a decision varying the original amount of the fine or periodic penalty payment or refusing an application for variation; 
    (b) by any action of the Commission, or of a Member State acting at the request of the Commission, designed to enforce payment of the fine or periodic penalty payment. 
  4. Each interruption shall start time running afresh. 
  5. The limitation period for the enforcement of penalties shall be suspended for so long as: 
    (a) time to pay is allowed; 
    (b) enforcement of payment is suspended pursuant to a decision of the Court of Justice of the European Union. 

Article 63
Right to be heard and access to the file 
  1. Before adopting a decision pursuant to Articles 58(1), 59 or 60, the Commission shall give the very large online platform concerned or other person referred to in Article 52(1) the opportunity of being heard on: 
    (a) preliminary findings of the Commission, including any matter to which the Commission has taken objections; and 
    (b) measures that the Commission may intend to take in view of the preliminary findings referred to in point (a). 
  2. The very large online platform concerned or other person referred to in Article 52(1) may submit their observations on the Commission’s preliminary findings within a reasonable time period set by the Commission in its preliminary findings, which may not be less than 14 days. 
  3. The Commission shall base its decisions only on objections on which the parties concerned have been able to comment. 
  4. The rights of defence of the parties concerned shall be fully respected in the proceedings. They shall be entitled to have access to the Commission’s file under the terms of a negotiated disclosure, subject to the legitimate interest of the very large online platform concerned or other person referred to in Article 52(1) in the protection of their business secrets. The right of access to the file shall not extend to confidential information and internal documents of the Commission or Member States’ authorities. In particular, the right of access shall not extend to correspondence between the Commission and those authorities. Nothing in this paragraph shall prevent the Commission from disclosing and using information necessary to prove an infringement. 
  5. The information collected pursuant to Articles 52, 53 and 54 shall be used only for the purpose of this Regulation. 
  6. Without prejudice to the exchange and to the use of information referred to in Articles 51(3) and 52(5), the Commission, the Board, Member States’ authorities and their respective officials, servants and other persons working under their supervision, and any other natural or legal person involved, including auditors and experts appointed pursuant to Article 57(2), shall not disclose information acquired or exchanged by them pursuant to this Section and of the kind covered by the obligation of professional secrecy. 

Article 64
Publication of decisions 
  1. The Commission shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties and the main content of the decision, including any penalties imposed. 
  2. The publication shall have regard to the rights and legitimate interests of the very large online platform concerned, any other person referred to in Article 52(1) and any third parties in the protection of their confidential information. 

Article 65
Requests for access restrictions and cooperation with national courts 
  1. Where all powers pursuant to this Article to bring about the cessation of an infringement of this Regulation have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the Commission may request the Digital Services Coordinator of establishment of the very large online platform concerned to act pursuant to Article 41(3). 
  • Prior to making such request to the Digital Services Coordinator, the Commission shall invite interested parties to submit written observations within a time period that shall not be less than two weeks, describing the measures it intends to request and identifying the intended addressee or addressees thereof. 
  2. Where the coherent application of this Regulation so requires, the Commission, acting on its own initiative, may submit written observations to the competent judicial authority referred to in Article 41(3). With the permission of the judicial authority in question, it may also make oral observations. 
  • For the purpose of the preparation of its observations only, the Commission may request that judicial authority to transmit or ensure the transmission to it of any documents necessary for the assessment of the case.

Article 66
Implementing acts relating to Commission intervention 
  1. In relation to the Commission intervention covered by this Section, the Commission may adopt implementing acts concerning the practical arrangements for: 
    (a) the proceedings pursuant to Articles 54 and 57; 
    (b) the hearings provided for in Article 63; 
    (c) the negotiated disclosure of information provided for in Article 63. 
  2. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70. Before the adoption of any measures pursuant to paragraph 1, the Commission shall publish a draft thereof and invite all interested parties to submit their comments within the time period set out therein, which shall not be less than one month. 

SECTION 4
COMMON PROVISIONS ON ENFORCEMENT 

Article 67
Information sharing system 
  1. The Commission shall establish and maintain a reliable and secure information sharing system supporting communications between Digital Services Coordinators, the Commission and the Board. 
  2. The Digital Services Coordinators, the Commission and the Board shall use the information sharing system for all communications pursuant to this Regulation. 
  3. The Commission shall adopt implementing acts laying down the practical and operational arrangements for the functioning of the information sharing system and its interoperability with other relevant systems. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70. 

Article 68
Representation 

Without prejudice to Directive 2020/XX/EU of the European Parliament and of the Council, recipients of intermediary services shall have the right to mandate a body, organisation or association to exercise the rights referred to in Articles 17, 18 and 19 on their behalf, provided the body, organisation or association meets all of the following conditions: 

  (a) it operates on a not-for-profit basis; 
  (b) it has been properly constituted in accordance with the law of a Member State; 
  (c) its statutory objectives include a legitimate interest in ensuring that this Regulation is complied with.

SECTION 5
DELEGATED ACTS 

Article 69
Exercise of the delegation 
  1. The power to adopt delegated acts is conferred on the Commission subject to the conditions laid down in this Article. 
  2. The delegation of power referred to in Articles 23, 25, and 31 shall be conferred on the Commission for an indeterminate period of time from [date of expected adoption of the Regulation]. 
  3. The delegation of power referred to in Articles 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force. 
  4. As soon as it adopts a delegated act, the Commission shall notify it simultaneously to the European Parliament and to the Council. 
  5. A delegated act adopted pursuant to Articles 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of three months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council. 

Article 70
Committee 
  1. The Commission shall be assisted by the Digital Services Committee. That Committee shall be a Committee within the meaning of Regulation (EU) No 182/2011. 
  2. Where reference is made to this Article, Article 4 of Regulation (EU) No 182/2011 shall apply. 

Chapter V

Final provisions 

Article 71
Deletion of certain provisions of Directive 2000/31/EC 
  1. Articles 12 to 15 of Directive 2000/31/EC shall be deleted. 
  2. References to Articles 12 to 15 of Directive 2000/31/EC shall be construed as references to Articles 3, 4, 5 and 7 of this Regulation, respectively.

Article 72
Amendments to Directive 2020/XX/EC on Representative Actions for the Protection of the Collective Interests of Consumers 
  1. The following is added to Annex I: 
  • “(X) Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC” 

Article 73
Evaluation 
  1. By five years after the entry into force of this Regulation at the latest, and every five years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. 
  2. For the purpose of paragraph 1, Member States and the Board shall send information on the request of the Commission. 
  3. In carrying out the evaluations referred to in paragraph 1, the Commission shall take into account the positions and findings of the European Parliament, the Council, and other relevant bodies or sources. 
  4. By three years from the date of application of this Regulation at the latest, the Commission, after consulting the Board, shall carry out an assessment of the functioning of the Board and shall report it to the European Parliament, the Council and the European Economic and Social Committee, taking into account the first years of application of the Regulation. On the basis of the findings and taking into utmost account the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation with regard to the structure of the Board. 

Article 74
Entry into force and application 
  1. This Regulation shall enter into force on the twentieth day following that of its publication in the Official Journal of the European Union. 
  2. It shall apply from [date – three months after its entry into force]. 

This Regulation shall be binding in its entirety and directly applicable in all Member States. 

Done at Brussels, 

For the European Parliament For the Council 

The President The President 

This initiative was launched by: MEPs Patrick Breyer, Alexandra Geese, Kim van Sparrentak, Marcel Kolaja, Rasmus Andresen, Damian Boeselager and Mikulas Peksa.

229 Remarks

    Position of the European Broadcasting Union on the Digital Services Act

    For members of the European Broadcasting Union (EBU) – public service media organizations in Europe – establishing a strong and consistent rulebook for global online platforms is crucial. We support the objective of the DSA to create a safer and more accountable online environment. However, we ask EU decision-makers to make the proposed DSA stronger. It must reflect the significant influence of online platforms on access to content and information and on opinion-making. Citizens need a stronger DSA to continue to engage online with the media they most trust and value.

    Online platforms have become important ways to reach audiences. Citizens use social networks, news aggregators or search engines to access news, information and other media content. Beyond their own digital channels and services, public service media offer their diverse content and information on online platforms and use the innovative opportunities the platforms provide to reach and interact with their audiences. But today the platforms determine who sees what and when – based on their algorithms, content recommendation systems, community standards and terms and conditions. And contrary to independent media, platforms remain largely unaccountable to the public.

    The DSA promises to create a safer, more accountable online environment through obligations for platforms to act against illegal content and by empowering platform users, offering them more transparency, traceability and better reporting systems. We support these important objectives and welcome strengthening the proposed obligations (e.g. extension of the ‘know your business user’ obligation to all online platforms). However, for the DSA to enable all citizens to have continued access to the trusted news, information and the rich plurality of views that media offers, EU policy makers must act on the following:

    Protect editorial media content and services from interference by online platforms

    Media service providers abide by strict EU and national media laws, are guided by professional editorial standards and are subject to specific regulatory oversight – no matter where or how their content and services are consumed. However, lawful media content regularly gets removed and apps and accounts blocked by online platforms without any prior warning (e.g. children’s content and applications, satirical content, current affairs and news programmes).

    To ensure the effectiveness of European and national media rules as well as the editorial independence of media in Europe, platform operators should not be allowed to exercise any control over programmes from media service providers or otherwise interfere with them once they are available on these platforms. The DSA should ensure that media organisations remain solely responsible for the content and services they produce.
    As media service providers maintain full editorial responsibility over their content, platform operators should neither be responsible nor liable for the content offered by media service providers on their platforms. This is key to balance out the freedom to conduct business of platform operators on the one hand and media service providers on the other. It will ultimately serve to foster public trust in journalism and media online.

    Ensure proper brand attribution of editorial media content

    When audiences access media content through social networks, news aggregators, or search engines, they need to be able to easily identify who bears the editorial responsibility over it. Failure by platforms to attribute content to its source or incorrect attribution of logos and branding deprives audiences of an essential element to judge the information they see and hear. The DSA should oblige platforms to ensure that the identity of media organisations, like other business users, is clearly visible (e.g. logos/branding).

    Secure the application of sector-specific EU and national laws

    The DSA sets harmonized and horizontal standards for a wide range of online platforms. This could result in overlaps with certain sector-specific EU and national laws and limit Member States’ competence to regulate cultural issues in relation to intermediary service providers. It should therefore be clarified that the DSA applies without affecting existing and future sector-specific measures, as well as those which serve to promote cultural diversity, media freedom and pluralism online.

    Enhance the transparency of platforms’ content recommendations

    The DSA must set high transparency standards on all online platforms regarding algorithmic decision-making processes and content recommendation. It is essential for media organizations and media users to better understand how platforms’ recommender systems affect the visibility, accessibility and availability of content and services online and to be able to tailor their services accordingly. As recommender systems are commonly used by all types of platforms, enhanced transparency rules should apply to all platforms and not be limited to very large platforms.

    Ensure proper enforcement and oversight

    National regulatory authorities for the media play a vital role to ensure media pluralism and safeguard freedom of expression and information. Member States should therefore make sure that the national regulatory authorities and bodies for the media are adequately involved in enforcement and oversight of the DSA.

    I am concerned about the continued collection of personal data by large internet companies for the purpose of targeted advertising. Apparently, the GDPR does not provide sufficient leverage to take action against this violation of data protection. Therefore, I had hoped that the DSA would put a stop to this abuse. Apparently, however, this is not provided for in the present draft of the Commission; in any case, it is not ruled out by the transparency provisions in Articles 24 and 30. I therefore urgently appeal to the Parliament to design the DSA in such a way that the collection of personal data is strictly prohibited. (Cheers for the adszuck and trackingfreeads initiatives!)

    Regarding cyber-based violence against women in the digital space:

    In order for the DSA to a) create a safer digital space in which the fundamental rights of all users of digital services are protected; and b) establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally:
    • Online abuse needs to be recognised as disproportionately affecting women, and viewed through an intersectional lens that recognises that Black and minority communities are disproportionately affected by online abuse.
    • It needs to be seen as part of a continuum of Violence Against Women.
    • Digital Citizenship Education, as part of a media literacy agenda, is key to prevention, alongside rules placed on tech companies.


    ITI Views on the European Commission Proposal for a Digital Services Act (DSA)

    Introduction

    Digital services play a foundational role in driving innovation and growth in the economy, supporting the smooth operation of digital supply chains and creating market opportunities and access for businesses of all sizes. Policymakers around the world are grappling with real challenges caused by the scale, speed, and complexity of various types of digital intermediaries, the roles they play regarding content and activities online, and in some contexts, their ability to shape public opinion. At ITI, representing all the segments of the tech industry, we understand and recognise the shared responsibility to maintain a safe, inclusive, and innovative online environment.

    We support the goals of the European Commission’s Digital Services Act to increase legal certainty, clarify roles, and define responsibilities for actors in the online context, i.e. by reviewing and bringing more clarity to the framework. In the following, we provide recommendations for a balanced and proportionate approach that combines regulatory scrutiny with appropriate rights for all actors in the Internet ecosystem.

    General Comments

    ITI welcomes the DSA and supports its ambition to create a more secure and transparent online space for all actors involved in the online ecosystem by introducing new obligations and rights for different actors in the online sphere. The differentiation between types of services and their impact is necessary to create a level playing field for all actors online and recognises the diversity of the online ecosystem while ensuring safety online for European citizens. Specifically, the proposal clearly differentiates between different types of digital intermediaries, such as mere conduit, caching and hosting service providers. Proportionate and risk-based rules that are targeted to different types of services are especially important when considering that many companies may not have the ability or right (technical, contractual, or otherwise) to edit or manage content.

    The proposal also clearly differentiates between responsibilities for smaller versus larger players. Proportionality is key to avoiding unnecessary burdens and the risk of stifling innovation and growth for all companies, especially emergent players. While the scale of platforms is an important factor, size alone does not fully reflect the risk inherent to each platform; other factors, such as impact, vulnerability of the business model to abuse, and demonstrated systemic exposure to illegal activities/content, should also play a role in determining additional specific obligations. Many services already have systems in place to address the needs of their customers and meet the expectations of governments and civil society regarding content moderation. Such systems should inform requirements, and new obligations should complement them. Platforms, especially large platforms subject to additional requirements, should maintain the ability to implement these requirements in a way that best reflects the nature of their services, the type of content they make available, and the risk exposure for users on their platform.

    Several important provisions leave critical definitions and methodologies to delegated acts. These are too central to the operation of the DSA to be left to delegated acts, which are not subject to the ordinary legislative process and therefore preclude participation by Parliament and Council as well as other stakeholders. The references to delegated acts in Articles 23 and 25 and the general outline in Section 5, together with the methodologies they seek to specify, should instead constitute an integral part of the Regulation.

    Relatedly, the lack of clear definitions makes it impossible to understand which companies and products will be subject to which requirements. In particular, we believe that the criteria used to define what constitutes a VLOP, how active users are identified, and the additional obligations associated with this status, would need to be defined in the law and must not be left to the Commission to decide via delegated acts. The diversity of the digital ecosystem has also produced diversity of users and companies’ interactions with users. Due consideration should be given to how users are counted, including whether it is based on registered users or guest visits.

    The Commission’s proposed enforcement framework resembles existing enforcement structures for other digital legislation, including the General Data Protection Regulation (GDPR), but seems to diverge in several respects. We propose adding more clarity on which authorities can undertake enforcement activities, in what circumstances, and the relevant due process protections, highlighting the need to create clear pathways. The oversight and enforcement regime should not undermine the country-of-origin principle which remains a key pillar to the functioning of the internal market. In addition, we would welcome clarifications on the methodology to calculate fines.

    Lastly, given the importance of this initiative, we want to highlight the need for all stakeholders to be able to feed into the legislative process. We appreciate the sense of urgency to make progress; nevertheless, we urge the co-legislators to take the time to get it right.

    Specific Commentary on the Proposal
    Articles 1-2, 11: Extraterritorial services
    We welcome the approach that rules will apply to providers of intermediary services irrespective of their place of establishment or residence, in so far as they provide services in the EU. However, the text of the proposal presents new and distinct challenges if pursued in its current form. Given the requirement to have a representative in Europe, as well as consideration for the ability of European citizens to interact with content and services based anywhere in the world, a clear set of rules for when a non-EU service must or must not comply with the DSA should be included in the Regulation in order to ensure that the DSA reflects the global nature of the digital ecosystem. Another important point to clarify regarding the scope of the law is whether “operating in” means the same as “offering services” in Article 2 of the proposal.
    Article 2: Definitions
    We welcome the clarification in Article 2(g) that illegal content is any content that is not in compliance with EU law or the law of a Member State. It is essential to maintain the principle that what is illegal offline is also illegal online.
    We also welcome the definition of online platforms, which provides greater certainty for those hosting services that do not store and disseminate information to the public and should therefore not be included in this platform category, such as B2B cloud service or IT infrastructure providers. Indeed, the potential inclusion of these B2B services would not serve the goals of the DSA, particularly where there is no direct link between the cloud service and the online dissemination of goods, services, or content to third parties. Further, a provider of such B2B enterprise or outsourced hosting services would not necessarily have legal access or control over client or user generated data or content. Many businesses rely on cloud infrastructure or IT providers to build applications, platforms or websites, yet the cloud provider is not necessarily intermediating between the business and its customers, particularly when the service in question is of technical nature.
    Articles 3-5: Clarifying rules for intermediaries
    We welcome the European Commission’s focus on restating and clarifying the liability exemptions for mere conduit, caching and hosting providers. However, more clarity on the concrete definitions of entities in Recital 27 would be welcome, e.g. defining the differences between mere conduit and caching services based on whether information is in transit or stored temporarily. Additional clarity on the differences between hosting services and online platforms would also be welcome, to ensure that a broad interpretation of the concept of “dissemination to the public” does not have unintended consequences. We encourage consistent application across Member States of the concept of actual knowledge in Article 5 as defined by EU case law.
    Articles 6-9: Safeguarding limited liability and no general monitoring obligation
    We welcome the commitment to maintaining a limited liability scheme while providing the much-needed legal certainty that voluntary content screening does not exclude platforms from liability exemptions. We welcome that the Commission underlines that there should be no general monitoring obligations for platforms to screen content on their sites, while promoting responsible actions at the same time. We also appreciate that the Commission outlines concrete conditions that Member States need to meet to issue requests addressed to platforms to act against illegal content or to provide certain information. However, Article 6 needs further clarity on what exactly constitutes actual knowledge of illegal content as referenced in Article 5. For instance, where an intermediary service provider has voluntarily reviewed content or activities for a certain type of specific unlawfulness (or for a certain type of specific violation of its community guidelines), the service provider is not necessarily deemed to have knowledge of any other ways in which the reviewed content or activities might be unlawful. ITI continues to recommend this clarification. In addition, we would further welcome a clarification that the protection of Article 6 extends to voluntary investigations or other activities aimed at detecting, identifying and removing, or disabling access to, content that violates intermediaries’ terms and conditions, whether by automated or non-automated means.
    With regards to orders to act against illegal content or to provide information, as specified in Articles 8 and 9, we encourage the proposal to include a list of contacts per Member State or provide alternative ways to simplify intake and prioritisation of such requests.
    Article 10: Single point of contact (SPOC)
    We appreciate the Commission proposal’s goal to establish fast and easy communication between national authorities and service providers. We agree that having formalised and publicised communication channels is the right approach; however, we ask for flexibility in implementing this requirement to account for differences in how companies are internally organised. For example, a single point of contact for both Member State authorities and trusted flaggers may in fact slow down the intermediary’s response. Having the option for separate teams dedicated to those stakeholders may make more sense in practice, depending on the volume and nature of notices the intermediary receives. We also believe that flexibility should be allowed when it comes to designating the points of contact within the intermediaries, and companies should be able to designate team members for this role without necessarily having to hire new staff. This SPOC should be available to Digital Services Coordinators (DSCs), national authorities, and trusted flaggers, but not the wider public, which should use other designated channels for reporting.
    SPOCs already exist today in the area of law enforcement, and they have proven to be beneficial for all parties. They help build trust and communication between services and authorities, resulting in a far more efficient and streamlined collaboration. For the various SPOCs envisioned within a company, these contacts need to reflect the different teams and workflows through which different issues and obligations will be handled, and flexibility in designating the single points of contact will be important.
    Article 12: Content moderation references in terms and conditions
    We are concerned that including information on content moderation in the terms and conditions could impact contractual liability, as acknowledged in recital 38, and create unintended claims for breach of contract under national civil law, in addition to the compliance and sanctioning regime established by the DSA. We believe companies should have the ability to list these separately or that the rationale for including them in the terms and conditions should be further clarified.
    We also seek clarification that the level of detail required under Article 12 will not allow bad-faith actors to circumvent intermediaries’ content moderation systems.
    Articles 13 and 33: Transparency reporting for illegal content moderation
    Transparency is an important aspect of trust in services and businesses on- and offline. The Platform-to-Business Regulation and consumer omnibus legislation provide a helpful legislative framework for identifying effective and efficient transparency tools that help users and authorities. More clarity is needed in the Regulation on what needs to be reported – i.e. take-downs on the basis of a legal order or administrative decision. We also encourage transparency regarding platforms’ policies in handling repeat infringers regarding illegal content. There could also be room for more cooperation between online platforms and public authorities to better address issues arising from repeat infringers.
    Articles 14-15: Notice and action mechanisms
    We appreciate the European Commission’s goal to provide all users of intermediary services, be they business users or individuals, the ability to make use of effective electronic notice-and-action mechanisms to report illegal content. Legislators should bear in mind that a potential widening of the notice-and-action system could lead to higher volumes of notices, including potentially unfounded ones, and dilute resources or take focus away from more meaningful cases.
    Notice formalities are important to help service providers determine the validity of requests. As different types of content may need to be acted upon differently, we caution against an approach whereby a notice that fulfils all formalities necessarily results in actual knowledge (as Article 14(3) appears to suggest). Additional detail may be necessary for platforms to determine the validity of requests and perform swift and proper action. Moreover, the DSA should acknowledge that notices should be directed in the first instance to the party with the technical and operational capability to take action against specific illegal content. Hosting service providers should have the ability, upon receipt of a notice through the mechanisms described in Article 14, to re-direct the notice to the party that has the technical and operational capability to take action. We caution against the publication of all statements of reasons in a public database, as we believe this would not be proportionate and may not be technically feasible. Legislators should also consider additional guidance for handling repeat offenders and informing customers of illegal product sales.
    Article 17: Complaint handling systems
    We take note of the Commission’s proposal to introduce an obligation for online platforms to set up an internal complaint-handling system against decisions around take-down of content, termination of service provision or account terminations. This is an area of significant ongoing investment by services, as established by the Platform-to-Business Regulation (P2B), and we encourage the Commission to align these requirements.
    In order to meet the goal of providing such a recourse system, automated systems may be critical to fulfil this obligation, as automated systems can be more efficient, consistent, and scalable. We therefore suggest that the proposal should focus on flexibility for tools and systems that platforms may use, including enabling automated systems and avoiding specific thresholds for human operators. We also recommend limiting the time frame during which such systems remain available to users, to ensure this obligation does not impose disproportionate costs on service providers.
    Article 18: Out-of-court dispute settlement
    Out-of-court dispute settlement (OOC) is already available under a number of EU laws intersecting with the DSA, such as the Audiovisual Media Services Directive, the P2B Regulation and the EU Copyright Directive. It is not clear whether additional OOC dispute settlement mechanisms are needed. We urge the Commission to harmonise those requirements, as well as ODR and ADR bodies for business-to-consumer issues. Any potential new rules on OOC dispute settlement should avoid prescriptive requirements around the use of alternative dispute resolution. All actors in the process should have the flexibility to respond in a proportionate way to the situation. While OOC settlement can be a viable alternative to court proceedings and can facilitate faster resolution of conflicts, there need to be safeguards against frivolous complaints, and parties engaged in OOC settlement should commit to its outcome and not launch judicial proceedings in parallel, while retaining a symmetric ability to challenge it.
    Articles 19-20: Trusted flagger schemes
    We welcome the Commission’s focus on innovative cooperation mechanisms between the different actors involved in the detection and takedown of illegal content online. Awarding trusted flagger status should be a joint effort by the platforms, the third parties, rightsholders, NGOs and state-backed groups seeking trusted flagger status, and the Digital Services Coordinator of the respective Member State, to ensure that expertise and experience are reflected in the process. The trusted flagger scheme should allow platforms some flexibility to select trusted flagger partners and to continue to manage and prioritise notices within the trusted flagger system depending on the urgency or severity of the content. To increase the efficiency of the new tool, sophisticated rights holders with a large IP portfolio and a good track record of accuracy in reporting should be able to qualify for trusted flagger status to confirm the authenticity of their goods.
    The proposed conditions that trusted flaggers must meet are balanced but could use further specification. For example, in the IP context, clarification would be welcome on what “organisations of industry” means in a context where there are trade associations on the one hand, and IPR service providers/agencies (e.g. REACT) on the other. In addition, the relationship between rights owners and collective rights groups needs to be clarified, in particular what the loss of trusted flagger status by a collective group means for its individual members. As trusted flaggers can be relevant and practical for both IP and non-IP content, it should be explored whether it would be efficient for both online intermediaries and rightsholders if different trusted flagger systems existed for different types of content.
    In other contexts, NGOs or state-backed groups promoting safety online might seek trusted flagger status, where safeguards against potential abuse or misuse of notice systems are essential. Further clarification would be useful when it comes to requirements for trusted flaggers to demonstrate expertise and whether they would need to have a point of contact and legal representative within the Member State of the DSC that they register with.
    We welcome the possibility to withdraw the trusted flagger status if the trusted flagger continuously submits insufficiently precise, inaccurate or wrong claims, and we believe that this process would benefit from further specification.
    Lastly, wide use of the new tool by third parties, rightsholders, NGOs and other groups could lead to a surge in notices; consideration should hence be given to the number of potential trusted flaggers per online platform to ensure that the processing of other notices is not slowed down.
    Article 21: Flagging serious criminal offences
    We support the concept that services should report suspicions of the most serious immediate threats to the life or safety of persons to law enforcement when they become aware of such activity. However, these circumstances need to be very clearly defined. In line with the limited liability framework, we urge that there must not be a general monitoring obligation and platforms should only be required to act if they are made aware of a situation and there is sufficient information to act. Additionally, further alignment with the conditions imposed under the Terrorist Content Online Regulation in that regard would be highly welcome.
    Article 22: Traceability of traders (Know-your-business-customer provision)
    KYBC schemes can be helpful to combat illegal content online and enhance consumer protection. Many hosting services already conduct background checks of their customers as part of their own trust and security processes. Nevertheless, while we are encouraged by the Commission’s effort in exploring traceability, we caution that the proposed obligations may in some areas need to be more clearly defined and proportionality ensured. For example, to the extent Article 22 is aimed exclusively at online marketplaces, this should be made clear. In addition, requiring online platforms to collect information about economic operators under 22(d) could be problematic, given the number of potential parties along the supply chain that this term may cover, and that this information would be required at the time of opening an account. Any approach should be harmonised and based on the collection of typical identifiers, as outlined in the European Commission’s proposal, in electronic format.
    Article 24: Online advertisement transparency
    We acknowledge the Commission’s goal to make the identification of advertisements easier for consumers online. We note that many obligations to disclose information on advertising are already in place. It is important that provisions on advertising take into account the reality of all advertising models and reflect the often dual roles that platforms play in this space. For example, in many instances platforms will not have access to the data as they work with third parties.
    Article 25: Defining VLOPs
    We believe that the criteria used to define what constitutes a VLOP, and the additional obligations associated with this status, would need to be defined in the law and must not be left to the Commission to decide via delegated acts. While we agree that reach and scale play an important role, other factors such as vulnerability of the business model to abuse and demonstrated systemic exposure to illegal activities/content may also be considered when determining whether the additional specific obligations are required.
    For example, consideration should be given to the qualities of the service and how the services address serious issues of illegal content. Size is not the only relevant criterion here as often smaller online platforms can also be responsible for the impactful dissemination of illegal content.
    The proposed definition, based on the threshold of 45 million average monthly active users, needs further specification to explain what constitutes an “active user” for the very different service types covered by the DSA. It should be clear whether the 45 million threshold relates to the number of end users of the customers of a hosting provider, or to the number of direct customers, where the hosting provider stores and disseminates content to the public at the request of a recipient.
    A proposed revision of the definition every 6 months creates legal uncertainty that is unhelpful given the significant compliance burden that companies would encounter when falling within the scope of this definition. Instead, we would suggest reassessing the definition at most every 1-2 years. In the same vein, we encourage a grace period for new VLOPs of 12 months before they have to implement the VLOP-specific elements of the legislation. The current timeline of 4 months is too short to set up an effective compliance process. For example, reporting obligations for VLOPs, such as reports due every 6 months, require time to set up the right processes.
    Articles 26-27: Risk management & mitigation
    The Commission proposal foresees that VLOPs need to identify systemic risks stemming from their services in the EU, including the dissemination of illegal content through them, negative effects on the exercise of the fundamental rights to privacy, freedom of expression or the rights of the child, and intentional manipulation of their services with actual or foreseeable negative effects on public health, minors, etc.
    Given the far-reaching nature of these obligations and the types of content that would be covered, we would welcome more legal certainty through, for example, a definition of what may constitute a systemic risk. We would also urge that VLOPs have flexibility over the mitigation measures that they choose to implement to address those risks, given the differences in their interactions with data and their business models. The proposed provisions have an honourable goal in mind; however, they may not be suitable for all types of VLOPs’ activities and should not extend to B2B services. In addition, in many instances using reporting mechanisms would be more useful than an annual analysis, allowing platforms to act quickly, efficiently and specifically on a particular issue. Standard content moderation procedures could, for example, detect spikes in illegal products being sold on a website, and the platform could notify the authorities and act accordingly. Policymakers should also be cautious that risk management does not inadvertently introduce general monitoring obligations.
    Article 28: Auditing
    We support the Commission’s goal of enhancing the transparency of online platforms through the DSA and acknowledge the appropriate role of audits. However, obliging VLOPs to conduct annual independent, external audits and publish the findings in an audit report may be repetitive or unduly onerous if they are already performing internal audits, without necessarily adding transparency or accountability. Many companies are already performing internal or external audits and making much of the required information available to stakeholders, and so we urge legislators to ensure that the Regulation sets guidelines or criteria for these audit reports, but does not necessarily mandate external auditing. The General Data Protection Regulation (GDPR), for example, has shown that internal auditing can be a successful approach to creating awareness of practices within an organisation and supporting accountability for legal standards, without requiring external auditing. There are further practical considerations, for example the feasibility of auditing in a privacy-compliant way and the availability of sufficiently qualified auditors capable of auditing the large scope of VLOP obligations within the one-year period envisaged in the DSA.
    Article 29: Recommender systems
    The obligation in Article 29 for platforms to set out the main parameters used by their recommender systems in their terms and conditions should not require companies to share trade secrets or business-confidential information. The requirements on ranking transparency outlined in the Platform-to-Business Regulation overlap considerably with the DSA’s provisions on recommender systems. The Platform-to-Business Regulation states that operators shall “not be required to disclose algorithms or any information that, with reasonable certainty, would result in the enabling of deception of consumers or consumer harm through the manipulation of search results.” It is unclear why the DSA would not provide for the same protections. Consideration should also be given to the context of recommender systems and the risk profiles of the platforms concerned.
    Article 30: Additional online advertising transparency obligations
    We appreciate the efforts to bring more clarity on online advertising; however, we are concerned that these far-reaching requirements would impose significant new burdens on companies without necessarily achieving a particular result. As with recommender systems, the context of ads and the potential risks should be taken into consideration. For example, certain ads, such as political ads or those focused on children, may require additional transparency to understand their reach and content. Further, the value chain in the online advertising business is quite complex, and consideration should be given to the different players and their interactions with content and users. Transparency obligations should be placed on the actors in this value chain with the most appropriate ability to access and disclose the required information.
    Article 31: Data access and scrutiny
    We believe data access requests should relate only to making available, upon request, certain clearly defined types of data collected by VLOPs. However, there need to be clear boundaries as to who can request such data, and we believe these should be limited to the Digital Services Coordinator in the Member State of establishment and to the European Commission for the purposes of enforcing this Regulation. Additional clarification is needed about the circumstances in which this should also be extended to independent academics and researchers whose research projects meet ethical and data security standards. We agree that VLOPs should be equipped with a right to due process and a right to challenge requests received. However, we believe that the grounds to refuse requests should be extended beyond the unavailability of the data requested or the protection of trade secrets to also include concerns about the requesting institution or academic in particular and the purposes for which the data may be used. We are strongly of the view that the details on the exact circumstances under which VLOPs have to share data with these groups should not be left to be decided in delegated acts, as this is an extraordinary power, and should instead be specified in the Regulation itself. Lastly, we urge flexibility in the format in which data would be transferred, so as not to impose an additional disproportionate burden on VLOPs.
    Article 34: Voluntary industry standards
    We support the Commission’s approach to rely on international, voluntary industry standards for notice-and-action systems, trusted flagger notices, APIs and interoperability for online advertisement transparency requirements, and data access. It is important that these be flexible and industry-driven in order to ensure compliance and efficiency. Furthermore, to ensure necessary international compatibility and alignment with a trade- and innovation-facilitative approach to European standardisation, we strongly encourage the Commission to rely on international standards.
    Article 35 & 36: Codes of Conduct
    Further to the points immediately above, we strongly support reliance on industry-driven, international standards and global best practices in the development of codes of conduct for systemic risks. We appreciate the inclusion of stakeholders from all parts of the ecosystem in the development of such codes. The DSA should include some “guardrails” that define at a high level what any code will and will not contain, for example that such codes will not mandate practices such as general monitoring. Due process in the development of such codes, including openness, transparency, avoidance of conflicts of interest, and well-established, consensus-based voting procedures, ensures that the resulting technical standards will achieve the aim of setting appropriate requirements.
    Articles 38-70: Implementation, cooperation, sanctions and enforcement
    The Commission proposes to set up a Digital Services Coordinator for each Member State and an EU-level body, the European Board for Digital Services, composed of a group of Digital Services Coordinators. We encourage the co-legislators to ensure that the enforcement structure does not create multiple accountabilities for a service. We do welcome the amount of detail given on the establishment processes, tasks and voting mechanisms for the European Board for Digital Services. We urge similar clarity on the tasks and objectives of the national Digital Services Coordinators and on the European Commission’s accountability mechanisms and due process safeguards regarding its proposed enforcement capacity.
    However, it is sometimes unclear on what justification certain obligations are based, or what their goal is. For example, equipping the European Commission with powers to conduct on-site inspections for VLOPs in specific circumstances seems to miss the goal of obtaining explanations from VLOPs in certain situations. We would welcome clarity on the rationale behind this provision, which should then be incorporated into the text.
    We appreciate the importance of responding accurately and promptly to information requests from Digital Services Coordinators, and recognise that in many instances a degree of discussion around the request will be helpful to all parties in clarifying the information that is sought and the forms in which it can be provided. To that end, we would encourage a provision in the Regulation that identifies the benefits of such discussions and allows for good faith requests for clarification.
    We would welcome clarifications on the processes and procedural safeguards for joint investigations, and on the methodology to calculate fines, as well as a limitation of the possibility to impose fines only to situations where specific provisions of the Regulation are systematically infringed.
    Article 74: Application timeline
    We believe that having the Regulation apply from 3 months after its entry into force is not simply very ambitious but clearly unworkable and out of line with the timeframes for implementation of other significant frameworks such as GDPR, the Goods Package or the VAT reforms. The timeframe to allow companies to set up compliance structures should be extended to at least 18 months unless the proposal changes radically toward a more tailored and proportionate approach. We believe that the suggested evaluation cycle of every 5 years is no match for the fast-paced internet economy and should be reduced to every 3 years to assess if the law is still fit for purpose.


    The legislative framework for regulating digital services and platforms must adequately protect rights of persons with disabilities. For this, both the Digital Services Act (DSA) and Digital Markets Act (DMA) must ensure:
    Accessibility: Our main demand is to ensure accessibility of intermediary services for persons with disabilities. It is important that accessibility is ensured for all users, and not only for consumers, to make sure that organisations and businesses run by or employing persons with disabilities enjoy the same rights as other users of digital services and platforms. We propose a new article under DSA Chapter III, Section 1- Provisions applicable to all providers of intermediary services highlighting accessibility. For DMA, we propose a requirement to ensure accessibility for persons with disabilities amending Article 6 – Obligations for gatekeepers susceptible of being further specified.
    Mainstreaming of accessibility throughout the legislation: in addition to the proposed article, accessibility of services, information, feedback and complaints mechanisms, dispute settling systems (Article 18, DSA), as well as reports of services (Articles 13, 23, and 33, DSA), national authorities (Article 44, DSA), and the European Commission is vital, so that this Regulation serves all EU citizens equally.
    Consistency with relevant international and Union legal frameworks: This Regulation must be consistent with other Union legislation on accessibility and equality, and be based on EU’s obligations under international human rights frameworks, namely the UN Convention on the Rights of Persons with Disabilities (UN CRPD) (see proposed new Recitals, DSA and DMA; Articles 1, DSA and DMA).
    Meaningful engagement with persons with disabilities: Involve persons with disabilities through their representative organisations in structures aimed at facilitating the implementation of the current Regulations, for example in the European Board for Digital Services (Article 47, DSA), or when drawing up codes of conduct for proper application of this Regulation (Article 35, DSA), and crisis protocols (Article 37, DSA).
    Effective data collection and reporting: Data on infringement of accessibility requirements under this Regulation should be reported by intermediary services to competent authorities, and included in the annual reports of these authorities in order to assess the effectiveness of this Regulation as regards ensuring accessibility of digital platforms and services for persons with disabilities (new Article 10, DSA).

    Context:
    As signatories to the United Nations Convention on the Rights of Persons with Disabilities (UN CRPD), the European Union (EU) and all Member States are legally obliged to ensure that persons with disabilities have access, on an equal basis with others, to information and communications technologies and systems, and other facilities and services open or provided to the public, both in urban and in rural areas. (Article 9 – accessibility).
    Accessibility is a pre-requisite for persons with disabilities to fully enjoy other rights enshrined by the UN CRPD, such as freedom of expression and opinion, and access to information (Article 21), participation in political and public life (Article 29), and participation in cultural life, recreation, leisure and sport (Article 30). The EU and Member States are also obliged to ensure rights of persons with disabilities to equality and non-discrimination (Article 5), freedom from exploitation, violence and abuse (Article 16), as well as protect the integrity of persons with disabilities (Article 17).
    Given the growing importance of digital services and online platforms, especially of gatekeepers, in the lives of all persons, the protection of the above-mentioned rights and freedoms is equally important in the online domain. As we become more dependent on digital technologies, their impact on Sustainable Development Goals concerning access to education, work, healthcare, social services, housing, transport and other spheres grows. Despite this, millions of persons with disabilities in the EU still face exclusion from digital participation, which hinders their participation in the areas of life mentioned above. This is largely due to the inaccessibility of digital technologies, including online platforms and services. Online discrimination and hate speech experienced by many persons with disabilities further reinforce their marginalisation and exclusion from the public domain.
    Recent EU initiatives such as the European Accessibility Act, the Web Accessibility Directive, the Audiovisual Media Services Directive and the European Electronic Communications Code have been important drivers for the inclusion and participation of persons with disabilities in society, and have demonstrated that the EU is committed to meeting its international human rights obligations under the UN CRPD. There are, however, still large gaps in the EU accessibility and anti-discrimination legislation needed to ensure full protection of the rights of persons with disabilities.
    Having provided feedback to the European Commission’s public consultation on the Digital Services Act package, we are disappointed by the disregard of accessibility of digital services and platforms for persons with disabilities in the EC proposals for the Digital Services Act and the Digital Markets Act. We strongly call on the European co-legislators to ensure accessibility of digital platforms and services, so that European law best serves the interests of all Union citizens, including the more than 100 million EU citizens with disabilities.
    As noted in our feedback to the EC public consultation and the adjacent recommendations document, we restate our four main recommendations that are vital for making sure that the proposed legislative framework adequately protects the rights of persons with disabilities in relation to digital services and online platforms. Those are:
    1. Ensuring accessibility of digital platforms and services with a universal design approach
    2. Preventing discriminatory content, including hate speech on online platforms and services
    3. Ensuring right to privacy and protection of personal data of persons with disabilities
    4. Strong enforcement mechanisms, including well-resourced EU and national regulatory authorities.


    As a network of democracy support organisations, the European Partnership for Democracy is greatly concerned about the lack of transparency and accountability mechanisms governing online platforms, despite their major impact on democratic processes. We therefore welcome the Digital Services Act (DSA) proposal by the European Commission as an important first step towards safeguarding our democratic principles in the online public square. To this end, we outline recommendations on two key aspects of the DSA for democracy, which we ask you to take forward in the negotiations.

    Transparency of advertising
    Online advertising drives excessive data collection, personal data abuse and intrusive targeting. In addition to the large-scale privacy breaches that continue to escape GDPR enforcement, this situation has led to the manipulation of democratic elections and the distortion of online public debate around the world. The DSA has a chance to set this right:
    We welcome Articles 22, 24 and 30 on advertising transparency, advertising repositories and the traceability of traders, and believe these provisions should be maintained. This is in line with demands from a coalition of 31 CSOs calling for universal ad transparency.
    In order to ensure meaningful transparency, the advertising repositories of Very Large Online Platforms should also include:
    – User access to their dynamic advertising profiles;
    – Audience reached;
    – Engagements with the ad;
    – GDPR basis for data processing and data source.

    Assessing and mitigating systemic risks
    In the face of the COVID-19 pandemic, platforms have shown their ability to counter the societal harms that their services amplify and accelerate, such as COVID-19 disinformation. Yet the storming of the US Capitol has highlighted platforms’ inability or unwillingness to intervene without political pressure to do so. Platforms thus seem ill-equipped and unsuited to identifying and mitigating systemic risks as identified in the Digital Services Act.
    While the principles of risk assessments (Article 26), risk mitigation (Article 27) and independent audits (Article 28) are appropriate, the Commission proposal needs to be refined in the following ways:

    – The risks detailed in Article 26 need to be more clearly phrased and defined. Broad terms like “civic discourse” and “intentional manipulation” of electoral processes risk being so vague that they will either not be applied, or be implemented in a way that may harm freedom of expression and the press.
    – Risk assessments and decisions on mitigation measures should be conducted through an inclusive, transparent and participatory process with participation from digital services coordinators, data protection authorities and media regulators, as well as civil society and rights groups, media associations and representatives of minority and disadvantaged groups. Decisions such as demoting particular kinds of content cannot continue to be taken without any oversight from free speech advocates. Such decisions need to undergo rigorous checks for their potential impact on people’s right to opinion formation, freedom of speech, freedom of the press and non-discrimination.
    – Codes of conduct that may trigger investigations into Very Large Online Platforms upon non-compliance are the right tool for defining systemic risks and stipulating the participatory process for risk assessments.
    – Risk assessment reports and mitigation measures need to be made publicly available, at a minimum in an edited form.
    – While audits are an excellent accountability mechanism, such audits should be conducted by a public oversight agency such as the European Board of Digital Services, rather than by private sector auditors. This is necessary to ensure consistent standards and methodologies for auditing over time and across platforms. Moreover, it would mitigate the risk of revolving doors between Very Large Online Platforms and the very few auditing firms who would be capable of auditing Very Large Online Platforms.

    “Gold standard” for a safe internet
    While the DSA is EU legislation designed for the EU, a secondary aim of the legislation, as with the GDPR, is to set a global “gold standard” for ensuring internet safety and accountability online. However, this global ambition does not take into account global voices and experiences. This is despite the very real threat that “copycat” legislative measures may pose to freedom of expression and other fundamental rights in other countries. Legislators should be aware of this wider global reach and responsibility.

    This statement is endorsed by the Club de Madrid, Democracy Reporting International, ePanstwo Foundation, European Partnership for Democracy, and Political parties of Finland for democracy (Demo Finland).


    I disagree with the copyright reform and upload filters. The DSA should give us more freedom. I may not be a citizen of the EU, but I have a few friends there and I care about their freedom and privacy.


    The European Women’s Lobby (EWL) welcomes the European Commission’s proposal on the Digital Services Act, the centring of fundamental rights, and objective to provide regulatory clarity to the online space.

    It is disappointing however to review this proposal and see just a singular reference to women and the distinct lack of recognition of the gendered reality of online violence and violations of fundamental rights. Violence against women and girls in the digital space is a part of the continuum of violence and not a separate phenomenon. It is clear that digital spaces present new or more complex ways to perpetrate all forms of violence, including sexual exploitation.

    The EWL reiterates that the European Union has a legal obligation to mainstream equality between women and men into all of its activities as per the TEU. This has similarly been outlined in the 2020-2025 Gender Equality Strategy, in which the European Commission commits to prioritising the fight against online harms that disproportionately impact women, particularly those subject to intersecting forms of discrimination.

    Thus far, the EU has failed to address this: both the Council Framework Decision on combating certain forms and expressions of racism and xenophobia and the EU Code of Conduct on combating hate speech (2016) fail to include women within their provisions. With the delay to the EU’s accession to the Council of Europe Istanbul Convention, and the lack of uniformity in Member State national legislation, such as article 67 of Law n° 2016-1321 of the French Criminal Code or Law No. 69 of the Italian Criminal Code (Article 612), the European Union lacks a harmonised framework to address online violence.

    We believe this proposal must therefore form part of and contribute to a robust EU regulatory framework, grounded in an intersectional feminist approach to combat violence against women and girls, specifically those enacted through digital services and platforms. This proposal should form part of this framework, aligning with other EU legislative initiatives, namely a Directive on preventing and combatting VAWG.

    In specific regard to this proposal, the EWL would highlight the need for
    – The proposal to ensure it complies with the EU’s legal obligations to promote equality between men and women as per Article 2 of the Treaty of the European Union, and to ensure non-discrimination and equality as per Articles 21 & 23 of the Charter of Fundamental Rights.
    – Further clarification on effective gender mainstreaming and the inclusion of an intersectional approach within aspects relating to reporting, content moderation, complaints mechanisms and dispute settlement mechanisms (for example, but not limited to Art13, Art14, Art18 & Art23)
    – For consistent and meaningful consultation with women’s civil society to ensure implementation of a gender sensitive approach within adequate sex-disaggregated data collection, oversight mechanisms, mitigation of risks and assessments (for example, but not limited to Art26, Art27 Art35, provisions outlined in Chapter IV)

    In conclusion, and as addressed in our contribution to the public consultation process, the EWL demands:

    ● The EU must adopt a robust EU regulatory framework, that is grounded in a comprehensive Directive on preventing and combatting violence against women and girls, recognising the continuum of violence and built on an intersectional feminist approach. The Directive would act as a cornerstone for the harmonisation with the Digital Services Act proposal, and any potential initiative on hate speech and crime.
    ● The European Commission to expand the list of EU crimes under Article 83(1) TFEU to include ‘violence against women and girls’ as a serious violation of women’s human rights needing to be addressed ‘on a common basis’ by all EU Member States, thereby ensuring the illegality provisions outlined within this proposal apply to online violence against women and girls.

    The EU has a legal obligation to ensure equality between women and men, as well as the opportunity to ensure that a European approach to a digital environment based on fundamental rights and trust is inclusive of women – 52% of the European population – in its strategy.


    Digital Services Act: Steps to Improving Protection of Animals, Consumers, Public Health and Tax Collection

    Context

    Scale of illegal animal trade

    The large majority of cats and dogs from the cross-border illegal trade are sold online, and this lucrative trade is booming, especially now at the time of COVID-19 when people are craving company. Such a need for a puppy or a kitten can be satisfied very quickly via online platforms and social media where animals are advertised. With limited requirements in place, poor enforcement and anonymity, it is easy for market participants to acquire and sell animals regardless of their origin, putting animal health and welfare, human health and public finances at risk and fuelling criminal activity. The annual value of the pet trade in Europe is estimated in the billions of euros; however, due to underreporting and the lack of standardised data collection, the real scale of the problem is unknown, as are the exact numbers of animals traded across European borders.

    Until now, any focus on the criminal pet trade has concentrated on the animal welfare and consumer fraud aspects, but it has become clear that the illegal trade is structured and organised by criminal elements. Both the EU and the Member States recognize the need to join forces to tackle this increasingly pressing problem and point to long-term solutions for the sake of protecting European citizens and the Single Market. Similar to the fight of central and local governments against the drug trade and human trafficking, combatting a pet trade controlled by criminal gangs must be assisted by mechanisms of support at the EU level.

    According to Europol, almost half of the organised crime groups operating in the EU are involved in more than one criminal activity. As the illegal pet and wildlife trade carries low risks and high profits, it serves either as a diversification of income sources for organised crime groups, or as a main activity. Largely controlled by highly organised criminal structures, this multimillion-euro industry clearly falls under the definition of organised crime, and must be addressed accordingly. Estimates of the value of wildlife trafficking alone range from EUR 8 billion to EUR 20 billion annually.

    Wildlife trafficking is one of the most profitable forms of transnational crime, alongside drug, human and arms trafficking. This highly organised criminal activity poses a risk to biosecurity, biodiversity and wildlife conservation, and is a major animal welfare concern. Interpol reports links between wildlife trafficking and other serious criminal activities and points to the low-risk, high-profit ratio as a major driver of environmental crime, stating that 60% of surveyed countries recognised a growing sophistication in, and adaptation to, environmental crimes committed by transnational organised crime groups. The illegal animal trade has flourished thanks to the Internet: consumers and illegal traders have access to a vast international marketplace that is open 24 hours a day, seven days a week, 365 days a year.

    Controlling and eradicating invasive alien species, treating people for zoonotic diseases,
    fighting against illegal trade and culling thousands of farm animals in order to prevent the
    spread of illness, are all direct consequences of absent or inadequate laws and regulations
    on exotic pet trade. In recent years, the growing trend for exotic pet keeping includes an
    increasing range of species on sale that are caught in the wild elsewhere around the globe,
    and the commercial trade in captive-bred wild animals.

    Animals traded illegally are often subject to immense suffering, and as 74% of Europeans believe the welfare of companion animals should be better protected than it is now, it is clear that this issue is of great importance to European citizens. Furthermore, the EU Security Union Strategy emphasises the need to tackle emerging threats and the importance of acknowledging the complex nature of organised crime. It is evident that the illegal animal trade plays an important role in the network of criminal activities.

    The illegal animal trade is a booming industry rife with consumer fraud, tax evasion, biosecurity concerns and animal abuse. The convergences between animal trafficking and other forms of transnational organised crime are well established. The EU and the Member States must take action to curb this serious form of organised crime, as it threatens European citizens, public health and the Single Market.

    Risks of illegal animal trade

    1. The illegal animal trade bears all the characteristics of organised crime, such as drug or human trafficking, as mentioned in the EU Strategy to tackle Organised Crime (2021-2025).
    2. Biosecurity concerns, including bioterrorism. Pets can be used as bioweapons: inconspicuous carriers of disease are capable of inflicting serious damage on the public health sector and, as a result, on economies.
    3. Health of consumers at risk. Rabies, a lethal zoonosis causing 59 000 deaths globally each year, is becoming more prevalent in countries such as the Netherlands or France due to the influx of unvaccinated puppies from Eastern European countries. Other risks include parasitic and bacterial infections with severe implications for human health, such as intestinal worm infestations and alveolar echinococcosis caused by Echinococcus multilocularis.
    4. Consumer fraud and violations of import rules. The pet’s health status, country of origin, vaccination status, breed and even existence can all be falsely presented to the consumer.

    The role of the EU

    The illegal pet trade is a problem in all EU Member States, and only an EU-wide solution can respond to an EU-wide problem. The EU Strategy to tackle Organised Crime (2021-2025) confirmed the seriousness of the illegal animal trade as organised crime, while the Croatian Presidency workshop Illegal Pet Trade: Game Over surfaced the sentiment among the majority of experts, including EU Member States: 92% declared a need for EU-wide rules for the trade of pets, 93% stated that online platforms should be made responsible for the verification of sellers’ information, and 90% indicated that only registered cats and dogs should be allowed to be advertised. It is now up to policy makers to decide to what extent protecting the health and welfare of traded pets, consumer rights and public health is relevant.

    Suggestions

    Step 1:

    ● Inclusion of illegally traded animals as an example of illegal content in the
    Digital Services Act (DSA) (Recitals 12 and 57)

    The illegal animal trade can cause significant damage to human health and biodiversity, and it also has a considerable impact in terms of tax revenue loss. The illegal animal trade has flourished thanks to the Internet: consumers and illegal traders have access to a vast international marketplace that is open 24 hours a day, seven days a week, 365 days a year. Due to this impact and scale, illegally traded animals cannot be left out of the DSA’s scope, which is why Recitals 12 and 57, referring to the notion of illegal content, must explicitly include illegally traded animals as an example of activities that are illegal.

    ● Application of programming interfaces for trusted flaggers (Recital 46)
    Technology is evolving at a high pace, which is why it can be hard to control illegal online content. To address this, the DSA should not only require in its Recital 46 the prior submission of notices by trusted flaggers through notice and action mechanisms, but also enable the implementation of programming interfaces for trusted flaggers, made available by online platforms.

    Step 2: Inclusion of closed groups and redefinition of the notions of ‘advertisement’ and
    ‘content moderation’ (Recital 14 and Article 2)

    Closed groups are also a means of disseminating information to the public. Many advertisements, posts and publications appear in closed groups with no direct financial return to the platform. Closed publications from individuals purposefully misrepresenting themselves and their interests have massively impacted major events such as the 2016 political elections in the US. The notion of advertisement should not be limited to cases where the platform gains financially from promoting the information. For that reason, the wording ‘against remuneration specifically for promoting that information’ must be removed. The notion of content moderation must also include taking preventive measures when detecting, identifying and addressing illegal content or information incompatible with platforms’ terms and conditions.

    Step 3: Inclusion of all enterprises in Section 3 and verification of the validity of the
    provided information, inclusion of product safety information and introduction of an
    authorised administrator (Articles 16, 22.1 and 22.2)

    In order to avoid any misunderstanding by users of the origin and purpose of advertisements and products, the platform must not only obtain the required information from the supplier but also verify the information given. Within that information, data related to product safety, such as product labelling and registration number (where applicable), also need to be obtained and checked. Finally, the publication of information on an online interface should be authorised by a designated administrator, the Member States or the European Union.

    Step 4: Obligation to report illegal activities and responsibility of the provider to not allow
    any post which they know to be false or misleading to promote the selling or supply of
    products (Articles 21 and 24)

    In line with the consumer protection Directive 2005/29/EC (as amended by Directive (EU) 2019/2161 on better enforcement and modernisation of EU consumer protection rules), which prohibits traders from creating a false impression of the nature of their products, a platform must not enable publications and posts which it knows to be false or misleading and which are made with the objective of selling or supplying goods. In addition, when the platform has a suspicion of any general illegal activity (not only of a serious criminal offence) operating on the platform (including the illegal pet/wildlife trade), it must report such activity to the national authorities.

    Reply
  • 0
    0

    * Involve users and civil society
    **Formalize the connection between existing support structures (e.g. women’s advice centres, LGBTIQA+ support groups) and the Digital Services Coordinator (DSA, Art. 2 (l)) for the groups most vulnerable to digital violence. Grant these organizations verified accounts and trusted flagger rights to help appeal against unjustified content take-downs, shadow banning and account blocking. This concerns e.g. doxxing as well as identity theft or the release of unauthorized information about a user’s (sexual) orientation.
    **Grant civil society and representative user groups a single point of contact. This helps to formalize the connection between existing support structures (e.g. women’s advice centres, LGBTIQA+ support groups) for the groups most vulnerable to digital violence and who are in need of help through a central point of contact. (DSA, Art. 10 (1))
    **Decentralize decision-making through Social Media Councils. Tie decision-making in controversial cases to a Social Media Council, following rules of fair discussion open to public debate. This applies especially to content that is not illegal but is taken down because it might not fit norms around e.g. societal nudity, body concepts etc. Social Media Councils should function as an independent and external body for complaints and user help in this matter.
    **Oblige platforms to provide independent community management that mediates between the content moderation team and users who feel their content or account has been wrongfully blocked, and make it part of the support structure for out-of-court settlement. The community management should focus on mediation and help users as well as content moderation teams to solve their case. Independent community management can also help to uphold users’ rights. (DSA, Art. 18 (1))
    *Educate and prevent through EU programmes
    **Informing users/citizens about their right to report content is part of media literacy and citizen education. Easily accessible language is a step towards understanding the terms and conditions and one’s own rights under them. Evaluate to what extent users/citizens are aware of their rights when it comes to understanding terms and conditions, e.g. reporting of content, where to seek help in cases of discrimination, fraud etc. (DSA, Art. 12 (1))
    **Support counter-speech initiatives through citizen-led and educational EU programmes that empower civil society, similar to the German initiative “Competence Centre: Hate on the Net” for young people.
    **Support and foster research and independent evaluation of reports that create a deeper understanding of the way social norms and sanctions are distributed in online communities and enhance self-regulation. Understanding how alternative content moderation is conducted helps to prevent the upload of illegal content and prevents discrimination. (DSA, Art. 31 (1))

    *Introduce measures for a sustainable digital ecosystem with diverse platforms
    **Introduce measures for a sustainable digital ecosystem with diverse platforms that do not only punish VLOPs but also promote and reward small actors that perform well in the field of content moderation, instead of only creating penalties to curb the power of economically driven platforms that have become ever more powerful.
    **Be aware of the growth in user numbers of social media platforms and their development. By 2025, a growth of about 30 million internet users is expected in the EU, and markets might shift under the DSA. Smaller niche platforms might grow big and will then fall under the law; such platforms might suppress their growth in order not to face high penalties. The operational threshold needs to be regularly revised, as the DSA in practice will probably cause changes to numbers and markets that are not considered today. Another aspect is the definition of a “proper user” (accounts, logins, unique logins etc.). (DSA, Art. 25 (1))
    **Introduce a ratio model that gives an orientation for the investment of labour resources in content moderation (a minimal illustrative calculation follows at the end of this group of measures):
    required labour resources = (platform content uploaded per hour) / (content one moderator can review per hour of work)

    This applies especially to industrial content moderation systems, which tend to follow a logic of exploitation and to minimize cost for maximum profit. Here the ratio model should help to frame a discussion around quality criteria. The “required labour resources” can be regularly revised and adjusted through hearings and discussions with the Digital Services Coordinator at national level. The guiding principle is a sustainable digital (informational and social) ecosystem that is led by human rights and social cohesion.
    **Differentiate between different content moderation types (industrial, artisanal and community-led). VLOPs tend to use the industrial type, which treats social media content in a factory-like manner and with no user involvement. Smaller platforms tend to work with community participation in their moderation, which leads to different results in acceptance and in the amount of moderation. Enhance best practices and develop innovative measures in the field of moderation as well as in community engagement. Thus, smaller platforms can compete, offer alternatives in the market of social media platforms and be part of a diverse digital ecosystem.
    **Make trusted flaggers transparent: publish the criteria for who can obtain trusted flagger status and how. Trusted flagger status should be issued only for a limited amount of time and should be regularly revised. (DSA, Art. 19)
    **Issue trusted flagger status only for a limited amount of time and revise it regularly. There should also be a frequent exchange with moderation teams and the Digital Services Coordinator. (DSA, Art. 19 (2))
    **Oblige platforms to make reporting procedures easily available (3-Click-Rule), in a user-centred design and in plain language. Prohibit dark patterns in reporting structures. (DSA, Art. 14 (1))
    **Supervise actual content moderation labour in the EU and ensure it is carried out by people who speak the language. Assure psychological support for content moderators, make their working conditions transparent, assure a certain level of training and education, and create professional standards in the field.
    **Document illegal content transparently, e.g. similarly to the GIFCT hash database (PhotoDNA), via an independent agency for research and supervision (e.g. the Lumen database).
    **It should be more transparent in detail what kind of illegal content is taken down, e.g. documented similarly to the GIFCT hash database (PhotoDNA) via an independent agency for research and supervision (e.g. the Lumen database). This can help to safeguard against and control unlawful take-downs. (DSA, Art. 13 (1) (b))
    **A good example of this kind of approach concerning illegal content is the Lumen database. (DSA, Art. 15 (4))
    **Create roundtables and make platforms exchange on specific cases that interrelate between platforms (cross-platform abuses, specific events), for all platforms, not only those that are obliged to follow crisis protocols etc. (DSA, Art. 37 (2))
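
    As referenced in the ratio-model item above, the following is a minimal, illustrative sketch of how the ratio could be applied in practice. It is written in Python purely for illustration, and all figures are invented assumptions rather than empirical data.

        # Illustrative calculation of the content moderation staffing ratio.
        # All numbers are assumptions made up for the example.
        content_uploaded_per_hour = 12_000      # items uploaded to the platform per hour (assumed)
        items_reviewed_per_moderator_hour = 40  # items one moderator can review per hour of work (assumed)

        required_moderators = content_uploaded_per_hour / items_reviewed_per_moderator_hour
        print(f"Required labour resources: {required_moderators:.0f} moderators working in parallel")
        # Output: Required labour resources: 300 moderators working in parallel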

    *Content Moderation practices that consider intersectionalities of discrimination
    **Content that is harmful or classified as “dangerous speech” should be reviewed on a case-by-case basis.
    **Complaint-handling decisions usually follow a logic of internal reporting categories, which are often too limiting to handle complaints appropriately. Reporting categories could, for example, include “discrimination against content on body positivity, ableism, queer body concepts”, in order to create sensitivity to the body-normative material that is sighted (and taken down) and to give space for deviating aesthetics in automated reporting systems. Furthermore, it would be good to understand how automated reporting systems are involved in these complaint-handling decisions. (DSA, Art. 13 (1) (d))
    **Audit A.I.-based decision-making, especially with regard to favouring specific, exclusive groups, topics or individuals. It is not enough to state that automated means are used to process the notice and to be transparent about it; the indicators behind the automated decisions need to be clarified further, as well as what kind of safeguards are applied and in what way. (DSA, Art. 23 (1) (c))

    Reply
  • 0
    0

    It is very good that the DSA preserves the liability exceptions of the e-Commerce Directive, including the ban on general monitoring. A clear set of responsibilities aligned with the degree of influence over the content shared on the internet is vital to provide a proper balance between the right of service providers to conduct a business and the protection of the fundamental rights of everyone involved. It is also vital for ensuring that recipients of intermediary services can properly exercise their fundamental rights, the freedom of expression in particular.

    What the DSA seems to be missing is a clear concept around the liability exceptions which ensures that freedom of expression is not unreasonably impaired, provides accountability and thus proper protection of other fundamental rights, and helps with the allocation of services into the different categories of intermediary services.

    In fact it fails to properly safeguard pure infrastructure providers from unnecessary moderation requirements, leading not only to unreasonable double moderation and thus an unjustified impairment of their freedom to conduct a business, but also to a massive restriction on the exercise of freedom of expression.

    So far there is also the problem that many intermediary services, like DNS services or CDNs, do not know whether, or under which, liability exception they fall.

    A general concept

    For the purpose of providing a clear concept around the liability exceptions I would suggest differentiating between three categories of roles within communication networks: infrastructure providers, service providers and content providers, with the last usually consisting of the recipients of services. The current liability exceptions are reasoned around a lack of control and knowledge over the information saved or transmitted by intermediary services. Setting knowledge aside, I would suggest streamlining the responsibilities of services according to the level of control they have over information: no control or ex post control:

    Infrastructure provider: Primarily enables/supports service providers in offering their service; has no control over information; acts as gatekeeper for other services; → no liability, no content control requirements, strict requirement to enable the exercise of fundamental rights.

    Service provider: Primarily enables content providers to use the communication network; has only ex post control over information; acts as gatekeeper for their recipients; → conditional liability exception; content moderation requirement; limited requirement to enable the exercise of fundamental rights.

    Content provider: Provides the actual content; has full and in particular ex ante control over information; → Full liability; implicit requirement to respect fundamental rights.

    For the sake of clarity, by “control” over information I mean the level of influence someone has on the material content of the information stored or transmitted. The content provider is usually the one who creates the content in the first place, or at least gets to see it beforehand. As such they are in a position to manually check its compliance with all applicable laws and should therefore be fully liable for that content, as they currently are.

    Intermediary services, on the other hand, are by design usually not actively involved with the content. The content gets automatically saved or transmitted by them, without any manual review beforehand, implying they have no knowledge of the content. Furthermore, they do not change the content and usually have no other direct influence on its material content. Consequently, they also have no control over the content. This complete lack of any proactive involvement with the content saved or transmitted by them is why these intermediary services are considered passive in nature.

    The only control intermediary services have over content, if any at all, is an ex post one. Hosting services in particular can remove or disable access to content saved on their servers, or in theory even change it. This puts them in a position to moderate the content.
    It is thus reasonable to expect service providers which are capable (by design) of moderating their content to engage in such ex post moderation. Consequently, they should be given immunity from liability under the condition that they engage in proper content moderation.

    Infrastructure providers, in contrast, are either not in a position to engage in ex post control of the content, or they can usually already rely on proper moderation by the services which they host or assist. As such it is unreasonable to expect them to engage in content moderation and they should therefore be granted unconditional immunity. Unconditional means that the liability exception should only require the service to be tailored to the design of an infrastructure provider, like e.g. a “mere conduit” provider only sending information but not storing it, and should not require further proactive actions like engaging in content moderation.

    The above list provides a proper allocation principle to distinguish between infrastructure and service providers: an infrastructure provider is tailored to primarily engage with service providers, and to engage with content providers solely to enable them to reach the service providers. As such it can rely on the necessary moderation being performed by a third party. Services which cannot claim that there is another service aided by them which could do the moderation are to be allocated into the service provider sphere. Only services which can reasonably expect others to perform the moderation, or which are incapable of performing such moderation, should be allocated into the infrastructure provider sphere.

    This way proper accountability is ensured. It could also be strengthened by a KYBC (know your business customer) requirement between infrastructure providers and service providers. This would ensure that at least someone is held accountable if illegal content is spread by a service on purpose.

    This leads us to another problem: infrastructure providers are the ones enabling service providers to operate in the first place, and therefore act as gatekeepers. If they refuse to support services which offer specific content protected by free speech, such as anything that is not illegal, no service can offer such content. There are already net neutrality rules which mandate that access providers must not prefer or disadvantage any service, in order to enable proper competition but also to protect free speech. The same rules MUST apply to ALL infrastructure providers, like DNS or cloud services, since their refusal to support or host a certain service has the very same effect.

    Service providers, on the other hand, should not be mandated to allow everything that is legal. If it is guaranteed that there are other services which host content they do not want to host, the exercise of the right of free expression for end users is still ensured. Yet especially larger online platforms should be mandated not to limit free expression arbitrarily, and in particular must not be allowed to ban or disadvantage certain content based on the opinion expressed in it.

    Missing liability exceptions for infrastructure providers

    For the sake of clarity, by cloud providers I mean hosting services which let their recipients rent servers or otherwise give them the resources to run arbitrary services, like blogs, shops, boards, personal homepages, etc.

    Cloud providers, which act as gatekeepers to all other service providers and other hosting providers in particular, are currently forced to exclude the use of their service for the provision of often problematic but still legal content, in order to reduce the likelihood of having to engage in content moderation on the services hosted by them. This is because the exception for hosting services in Article 5 does not distinguish between services only offering the technical resources and those running the actual hosting service. As such, cloud services either directly share the moderation requirement with the hosting service, including the necessity to process notices and ensure illegal content gets swiftly removed, or they do not enjoy any liability exception at all.

    This double moderation requirement puts a disproportionate burden on cloud providers, which they try to reduce by expelling any content that is likely to result in reports, irrespective of whether the content in question is legal or protected by free speech.

    For example a paragraph in the TOS of Hetzner, one of Europe’s biggest cloud providers, reads:

    “The client undertakes not to publish content that may violate the rights of third parties or otherwise violate the law. The placement of erotic, pornographic, extremist material or material not deemed in good taste is not permitted. We are entitled to block access to the account of any customer who violates this. The same applies in the event that the customer publishes content which is capable of violating the rights of individuals or groups of people, or that insults or denigrates these people.”

    The most problematic parts are the wordings “may violate…” and “not deemed in good taste”. As pointed out, cloud providers want to make sure they will not receive any notices of illegal content to process, and as such prohibit anything that “may” be illegal, and not just what IS illegal. Following the same logic, they also prohibit even anything that is “not in good taste”, since such content may also encourage people to report it, even if such reports are entirely unfounded from the beginning.

    Practically all major cloud providers have similar provisions in their TOS, giving them the freedom to expel any service that contains content with a possibility of being illegal or which is likely to result in many complaints.

    The result of this is that it is impossible to host a service which allows its users to fully exercise their right of free speech, since cloud providers would not allow such a service to exist in the first place, even if it ensures all illegal content gets swiftly removed.

    In order to overcome this restriction on free speech, essentially two adjustments are needed: cloud providers need an unconditional liability exception so that they bear no risks from allowing all legal content, and they need to be required by law to provide their service to any service that is in compliance with the law, making sure they do not abuse their power to limit the availability of legal content which they, for whatever reason, do not like or do not want to support.

    I would suggest adding a new liability exception:

    Cloud
    1. Where an information society service is provided that consists of the provision of technical resources aimed at enabling another information society service to offer its service, the service provider shall not be liable for the activities of the other service provider, including the transmission of information in a communication network or the saving thereof, on the condition that the provider:
    a. does not have actual knowledge of the other service’s primary purpose being the engagement in, or the support of, criminal activity; or
    b. upon obtaining such knowledge, acts expeditiously to cease providing its service to the other service, including the removal or disabling access to illegal content saved on that other service.
    2. Paragraph 1 shall also apply in cases where the same technical resources are provided to a natural person for their personal use only.
    3. Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider.
    4. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement.

    Conversely Article 12 should be amended by two new paragraphs:

    3. Providers of intermediary services which enable other information society services to operate may not exclude in their terms and conditions the use of their service for services which are not in breach of any applicable Union or national law. All services operating in compliance with all applicable laws must be capable of using their services.
    4. Paragraph 3 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.

    There are even more infrastructure providers which may fall into the current “hosting” category, and which are therefore required to engage in unnecessary content moderation.

    The first are DNS providers. All they do is provide a mapping between a domain name, like duckduckgo.com, and the IP address it belongs to. They are needed so computers know which IP to contact when somebody enters such an easily readable domain name into the browser’s address bar. As such, their service is tailored to a purely technical aspect, and furthermore they have zero influence on the content hosted on the corresponding domains.
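
    To make the purely technical nature of this service concrete, the following is a minimal sketch of what such a lookup amounts to, using Python’s standard library; duckduckgo.com is simply the example domain from above, and the addresses returned will vary.

        # Resolve a domain name to the IP address(es) that DNS maps it to.
        import socket

        addresses = {info[4][0] for info in socket.getaddrinfo("duckduckgo.com", None)}
        print(addresses)  # the name-to-IP mapping that a DNS provider serves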

    From a technical point of view they are hosting information – the domain name and the corresponding IP – provided by the recipient of their service, the website the domain name belongs to.
    As such they are hosting providers and are required to remove any illegal content once notified. Since the provision of the domain amounts to assisting the spreading of illegal content if the domain’s website contains illegal content, the domain itself becomes illegal content in this case. The DNS provider would in theory be required to block any domain it becomes aware of containing illegal content at the very moment of attaining this awareness. Note that this applies even where the DNS provider can be sure that the illegal content on the linked website is soon to be removed by the website operator themselves.

    This legal uncertainty puts the proper functioning of the internet at risk and should be removed. I suggest adding a new liability exception for DNS providers, which may also benefit future similar services:

    Contact
    1. Where an information society service is provided that consists of the storage of information provided by a recipient of the service, the service provider shall not be liable for the information stored at the request of a recipient of the service if the sole purpose of that information is to enable others to contact the recipient of the service over a communication network on a technical level.
    2. Paragraph 1 shall also apply in cases where the saved contact information relates to a third party, if that third party has requested the recipient of the service to store that information on the service on his or her behalf.
    3. Paragraph 1 shall not apply in respect to additional information stored by the service which is unrelated to contacting the recipient of the service on a technical level. This does not preclude the saving of additional data which is relevant for the contacting of the recipient on a technical level, such as options or preferences relevant for connections and messages.
    4. Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider.
    5. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement.

    A similar problem exists with some CDNs. They host content for other hosting services, taking the load off these hosting services’ main servers. Some CDNs operate in such a way that they fulfil the requirements for “caching” as defined in Article 4. Yet some CDNs host content directly instead of retrieving it from another server, rendering them ineligible for the “caching” exception and making them fall under “hosting”.

    Again this leads to a double moderation requirement. While the actual hosting service assisted by the CDN usually has the capability to remove any of its content currently co-hosted by the CDN if it finds out it is illegal, the CDN would in theory still be required to remove it too if notified about it.

    But most CDNs do not have any notice mechanism available, since they act on a purely technical level hidden from end users. The content appears to belong to the assisted host, as it effectively does in practice. But since hosting services are now mandated to provide “easy to access” and “user-friendly” notice mechanisms, the provision of such CDNs becomes technically impossible without violating the law.

    I would therefore suggest requiring such auxiliary services to allow the assisted host to still engage in content moderation of the content hosted at the CDN, while giving them an unconditional liability exception otherwise:

    Auxiliary
    1. Where an information society service is provided that consists of the storage of information provided by another information society service, or the processing thereof, performed for the sole purpose of assisting the other service in providing its service, the service provider shall not be liable for the information saved or processed that was provided by the other service, on the condition that the provider:
    a) only modifies the information as requested by the other service; and
    b) enables the other service to remove or disable access to the information stored.
    2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider.
    3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement.

    Such a new exception would also benefit other auxiliary services, like services facilitating the rendering of videos in multiple resolutions. But again, it might be reasonable to impose a KYBC requirement on such auxiliary services to ensure they are unlikely to be abused by illicit services.

    Reply
  • 1
    0

    This threshold of 45 million seems quite arbitrary to me. A better justification based on impact and a more flexible threshold are needed.Reference

    Reply
  • 1
    0

    This threshold of 45 million/10% seems quite arbitrary to me. A better justification based on impact and a more flexible threshold are needed.Reference

    Reply
  • 1
    0

    We would prefer if the term “objective” is replaced by the term “non-arbitrary”. The reasoning is that the latter term is better framed in law and jurisprudence. It also more clearly defines what the provider is expected to do.Reference

    Reply
  • 2
    0

    The paragraph is written in a way that suggests that a notice gives rise to actual knowledge of illegal content. The wording is not as clear as it should be regarding such an important concept.

    We would suggest re-wording the present paragraph. It should be clear that notices give rise to knowledge of the information specified in the notice. Notices may or may not be proof of illegal content.Reference

    Reply
  • 0
    0

    Regarding the definition of “illegal content”, it is questionable whether content itself can be illegal, or whether only the actions of people who write, modify, transmit, copy, use, sell, buy, publish or republish the content in certain situations would be illegal.

    Regarding the definition of “online interface”, the current definition is not very good. Proposed alternative: “a communication channel between two or more people located in different places, or between a person and a computer system processing information from the person, where the communication occurs via the internet or otherwise through the public communication infrastructure.”Reference

    Reply
  • 0
    0

    How does this power to conduct on-site inspections interact with the laws of the country where the inspected organization is based? Is this power limited to being available within the EU?Reference

    Reply
  • 1
    0

    If a hosting provider hosts end-to-end encrypted content, it can be made aware of the fact that the content is illegal, but it cannot verify this. This mechanism of informing the provider can then be used maliciously to censor unwanted content without it being illegal.Reference

    Reply
  • 0
    0

    It should be clarified whether unidirectional broadcasters, such as Netflix, national television or radio broadcasters, or companies’ public websites that do not allow the public to contribute, are considered to be included in the definition, because they allow a privileged set of people or organizations to publish content on the internet.Reference

    Reply
  • 0
    0

    The categorisation of services lacks two main elements: 1) there are no criteria for the allocation of services; 2) the proposal lacks a clarification of how the categories align with categories established by other legislation. The DSA still commits to the underlying cascade of intervention options (content provider, host provider, access provider), which, for effective intervention by media regulation, results in a necessity for clear allocations. A categorisation of services should especially take into account the rise of hybrid services. The criteria used for that purpose cannot rely on information authorities will have to ask a service to provide.Reference

    Reply
  • 1
    0

    This article, as well as Art. 9, only obliges platforms to respond to an order. A legal consequence is lacking. As such, these provisions mirror the status quo ante in cross-border enforcement. Moreover, the proposal lacks a statement on how the order is to be delivered in a legally compliant manner. It should be safeguarded that the procedure under Art. 8 does not fail because the order has not been properly delivered.Reference

    Reply
  • 0
    0

    The provision to act against specific content on a case-by-case basis lags behind the reality of media regulation, as systemic issues like the lack of an age verification system are common. The proposal should include the possibility to place an order against a systemic failure by an intermediary service. Moreover, the term “illegal” is too vague and could be interpreted in a way that only content prohibited under criminal law is in scope. This issue could be clarified, e.g. by using the term “non-legal” instead.Reference

    Reply
  • 0
    1

    The obligation to compose the order in a language chosen by the platform is disproportionate. It opens up unacceptable possibilities for providers to place an unreasonable hurdle in the way of the acting authority. Instead, giving providers the choice between the European working languages (English, French, German) would be more appropriate.Reference

    Reply
  • 0
    0

    This article, as well as Art. 8, only obliges platforms to respond to an order. A legal consequence is lacking. As such, these provisions mirror the status quo ante in cross-border enforcement.Reference

    Reply
  • 0
    1

    There is an implied possibility for intermediaries to use their terms and conditions to restrict the use of their platforms beyond the legal provisions. As far as editorial content by media services is concerned, this possibility could lead to double control, because such services are already controlled by the competent authorities (broadcasts and on-demand services) or by working self-regulation (press). Such double control is a threat to the freedom of speech and the freedom of the media and the press, because intermediaries, due to their market power, could de facto set the standards for freedom of speech instead of the legislator. The DSA has to ensure that the legislator’s prerogatives remain valid also when defining the limits of freedom of expression. A possible solution might be to explicitly prohibit intermediaries from deviating from legal provisions in their terms and conditions. Alternatively, through a clear distinction from sector-specific media law, it could be clarified that offerings already controlled by a functioning media regulator or self-regulation are not additionally subject to intermediaries’ terms and conditions.Reference

    Reply
  • 0
    1

    The proposal differentiates certain provisions and intervention thresholds based on the size of a platform. From a media law perspective, such a differentiation does not reflect the reality of regulation. Especially when dealing with inciting content, we regularly observe a shift to (then still) smaller platforms without any decrease in the threat to fundamental democratic values. A differentiation based on the size of a platform should only be made in the area of legal consequences.Reference

    Reply
  • 0
    0

    The obligation for Member States to nominate a Digital Services Coordinator interferes deeply with the internal structures of the Member States, which is especially problematic in the area of media regulation. The proposal forces the Member States, or at least the different regulatory authorities, to de facto establish a system of hierarchy. Possible solutions: coordination should happen at Member State level by using existing European structures (ERGA, BEREC, CPC, the Data Protection Board, the European Competition Network) and by delegating the members of the Digital Services Board directly from those networks and groups. Similar to the principles of the Council of Ministers, the Digital Services Board could thereby ensure that the appropriate sector-specific part becomes active and functions as a mediator towards the sector-specific national authority when the DSA is applied.Reference

    Reply
  • 0
    0

    The provisions for cross-border cooperation of the Digital Services Coordinators are not binding enough. In that respect, they fall behind existing forms of such cooperation. ERGA’s Memorandum of Understanding, for which the Council explicitly demands support in its Council conclusions of 11 November 2020, for instance includes considerably more consequential provisions for the territorially competent authority and does not leave the authority in the country of destination without a result. The governance structure of the DSA should be oriented more strongly towards existing forms of cooperation and build on proven concepts. Action by the authority in the country of destination must be possible as an ultima ratio.Reference

    Reply
  • 0
    0

    The escalation stage via the inclusion of the European Commission gives it too much room for its own discretion and too long deadlines (key concerns: subsidiarity, independence, state neutrality of supervision). This endangers the level of protection already reached. The sovereignty of the authority in the country of destination has to be preserved despite the involvement of the Commission, and the deadlines need to be shortened.Reference

    Reply
  • 0
    0

    The Commission foresees too many competences and final decision-making options for itself when it comes to very large platforms. These ideas, recognisably borrowed from rules concerning the defence against terrorism, do not meet the specificities of media regulation. Such a governance structure is not compatible with the subsidiarity principle. Moreover, it cannot be brought in line with the need for independence to which media regulation is subject. Both principles are, not without reason, part of the recently revised AVMSD. They are essential for securing a stable, democratic media environment and the protection of fundamental European protected goods. A solution could again be reliance on existing structures within existing European networks like ERGA, BEREC, the Data Protection Board or the European Competition Network. Members of these existing networks should form the Digital Services Board.Reference

    Reply
  • 0
    0

    Attempting to apply EU law to the whole world will result in EU citizens’ access to large numbers of non-EU web sites and apps being blocked by the web site operator.

    The Very Large Platforms are already established in the EU and can be regulated. This provision is self-defeating.Reference

    Reply
  • 0
    0

    Web site operators (that carry User Generated content) will not establish a representative officer within the EU, except for the very largest. They will either block access to their website from the EU, or simply ignore this law.

    Internet fragmentation is not in the interests of EU citizens.

    The EU should continue to promote the country of origin principle as the cornerstone of a global, universal Internet.Reference

    Reply
  • 1
    1

    This definition makes a buyer a “trader” too.

    As a result, privacy for people buying on online marketplaces will be abolished by the KYBC proposals in Article 22.

    The fix is to restrict the definition of trader to those selling.Reference

    Reply
  • 1
    0

    This should be open to public scrutiny, not behind closed doors.Reference

    Reply
  • 0
    0

    This basically allows the VLOPs to propose new laws binding their services and the DSCs to enact those laws, without public scrutiny or democratic accountability.

    Either this should be withdrawn completely, or at least made subject to an overriding right for third parties (citizens or businesses) affected by those decisions to challenge the quasi-law agreed between the DSC and the VLOP.Reference

    Reply
  • 1
    0

    Article 22 already seems to address this: “… it shall ensure that traders can only use its services to promote messages on or to offer products or services to consumers located in the Union if…”Reference

    Reply
  • 0
    0

    Add new recital 4
    “n/a As Party to the United Nations Convention on the Rights of Persons with Disabilities (UN CRPD), provisions of the Convention are integral part of the Union legal order and binding upon the Union and its Member States. The UN CRPD requires its Parties to take appropriate measures to ensure that persons with disabilities have access, on an equal basis with others, to information and communications technologies and systems, and other facilities and services open or provided to the public, both in urban and in rural areas. General Comment No 2 to the UN CRPD further states that “The strict application of universal design to all new goods, products, facilities, technologies and services should ensure full, equal and unrestricted access for all potential consumers, including persons with disabilities, in a way that takes full account of their inherent dignity and diversity.” Given the ever-growing importance of digital services and platforms in private and public life, in line with the obligations enshrined in the UN CRPD, the EU must ensure a regulatory framework for digital services which protects rights of all recipients of services, including persons with disabilities. Declaration 22 annexed to the final Act of Amsterdam provides that the institutions of the Union are to take account of the needs of persons with disabilities in drawing up measures under Article 114 TFEU. “Reference

    Reply
  • 0
    1

    Add new recital 5
    “n/a Given the cross-border nature of the services at stake, EU action to harmonise accessibility requirements for intermediary services across the internal market is vital to avoid market fragmentation and ensure that equal right to access and choice of those services by all consumers and other recipients of services, including by persons with disabilities, is protected throughout the Union. Lack of harmonised accessibility requirements for digital services and platforms will also create barriers for the implementation of existing Union legislation on accessibility, as many of the services falling under those laws will rely on intermediary services to reach end-users. Therefore, accessibility requirements for intermediary services, including their user interfaces, must be consistent with existing Union accessibility legislation, such as the European Accessibility Act and the Web Accessibility Directive, so that no one is left behind as result of digital innovation. This aim is in line with the Union of Equality: Strategy for the Rights of Persons with Disabilities 2021-2030 and the Union’s commitment to the United Nations’ Sustainable Development Goals.”Reference

    Reply
  • 0
    0

    Add new recital 6
    “The notions of ‘access’ or ‘accessibility’ are often referred to with the meaning of affordability (financial access), availability, or in relation to access to data, use of network, etc. It is important to distinguish these from ‘accessibility for persons with disabilities’ which means that services, technologies and products are perceivable, operable, understandable and robust for persons with disabilities.”Reference

    Reply
  • 0
    0

    (b) set out uniform rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected.Reference

    Reply
  • 0
    0

    replace Directive (EU) 2010/13/EC with Directive (EU) 2018/1808;Reference

    Reply
  • 0
    0

    add new r.
    ‘persons with disabilities’ means persons within the meaning of Article 3 (1) of Directive (EU) 2019/882;Reference

    Reply
  • 0
    0

    Due diligence obligations for a transparent, accessible, and safe online environmentReference

    Reply
  • 0
    0

    Add Article 10 new – Accessibility requirements for intermediary services
    “1. Providers of intermediary services which offer services in the Union shall ensure that they design and provide services in accordance with the accessibility requirements set out in Section III, Section IV, Section VI, and Section VII of Annex I of Directive (EU) 2019/882.
    2. Providers of intermediary services shall prepare the necessary information in accordance with Annex V of Directive (EU) 2019/882 and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public in written and oral format, including in a manner which is accessible to persons with disabilities. Intermediary service providers shall keep that information for as long as the service is in operation.
    3. Providers of intermediary services shall ensure that information, forms and measures provided pursuant to Articles 10 new (9), 12(1), 13(1), 14(1) and (5), 15(3) and (4), 17(1), (2) and (4), 23(2), 24, 29(1) and (2), 30(1), and 33(1) are made available in a manner that they are easy to find, accessible to persons with disabilities, and do not exceed a level of complexity superior to level B1 (intermediate) of the Council of Europe’s Common European Framework of Reference for Languages.
    4. Providers of intermediary services which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements. Changes in the characteristics of the provision of the service, changes in applicable accessibility requirements and changes in the harmonised standards or in technical specifications by reference to which a service is declared to meet the accessibility requirements shall be adequately taken into account by the provider of intermediary services.
    5. In the case of non-conformity, providers of intermediary services shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements. Furthermore, where the service is not compliant with applicable accessibility requirements, the provider of the intermediary service shall immediately inform the Digital Services Coordinator of establishment or other competent national authority of the Member States in which the service is established, to that effect, giving details, in particular, of the non-compliance and of any corrective measures taken.
    6. Provider of intermediary services shall, further to a reasoned request from a competent authority, provide it with all information necessary to demonstrate the conformity of the service with the applicable accessibility requirements. They shall cooperate with that authority, at the request of that authority, on any action taken to bring the service into compliance with those requirements.
    7. Intermediary services which are in conformity with harmonised standards or parts thereof the references of which have been published in the Official Journal of the European Union, shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those standards or parts thereof cover those requirements.
    8. Intermediary services which are in conformity with the technical specifications or parts thereof adopted for the Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those technical specifications or parts thereof cover those requirements.
    9. All intermediary services shall, at least once a year, report to Digital Services Coordinators or other competent authorities on their obligation to ensure accessibility for persons with disabilities as required by this Regulation.
    10. In addition to Article 44 (2), Digital Services Coordinators shall include measures taken pursuant to Article 10 new.”Reference

    Reply
  • 0
    0

    (c) the dispute settlement is accessible, including for persons with disabilities, through electronic communication technology;Reference

    Reply
  • 0
    0

    (d) it is capable of settling disputes in a swift, efficient, accessible for persons with disabilities, and cost-effective manner and in at least one official language of the Union;Reference

    Reply
  • 0
    0

    New point g.
    “accessibility of elements and functions of online platforms and digital services for persons with disabilities aiming at consistency and coherence with existing harmonised accessibility requirements when these elements and functions are not already covered by existing harmonised European standards”Reference

    Reply
  • 0
    0

    (a) displaying prominent information on the crisis situation provided by Member States’ authorities or at Union level in a manner which is accessible for persons with disabilities.Reference

    Reply
  • 0
    0

    new g.
    measures to ensure accessibility for persons with disabilities during implementation of crisis protocols, including by providing accessible description about these protocolsReference

    Reply
  • 0
    0

    Three months seems rather unrealistic. For proper planning, implementation and coordination with other involved parties, a time frame of 18 to 24 months will be necessary.Reference

    Reply
  • 0
    0

    Further clarification of the definitions, along the lines of Recital 13, such as the meaning of ancillary features, would be beneficial to avoid unintentionally capturing non-platform services. As an example, it should be clarified that services whose primary purpose is not the dissemination of information to the public, and which are not obviously used for such purposes but may nevertheless provide users with some basic or limited sharing functionalities, are not classified as online platforms.

    One of the points of general criticism of the ECD was its partial lack of clarity regarding definitions. This was the case, for example, with the notion of actual knowledge: up to now, it has not been clearly established when the criteria for actual knowledge are fulfilled and when they are not.

    Naming fundamental services contributing to the technical infrastructure of the Internet, such as DNS, CDNs or registries, and confirming that these services can benefit from liability protection under the DSA is a welcome step. However, the framework’s clarity would benefit from including these references in the operative part of the proposal and specifying into which category of intermediaries the services fall.

    A clarification would be welcome that the exemption applies only to the specific part of the trader’s platform on which users conclude distance contracts, and not to intermediaries that forward users to a third-party trader’s website where the contract is concluded (one example would be price comparison websites).

    Recital 36 refers to the possibility of trusted flaggers or other ‘professional entities’ using the SPOC, aside from authorities, the EU Commission and the DSC Board.
    It should be underlined that, while publishing contact information might allow for ‘easy’ communication, it is challenging and inefficient for companies to have to handle a wide variety of unrelated communications through a single, publicly disclosed point of contact. This approach is more likely to result in important messages being overlooked, like a needle in a haystack.
    The decision should be left to intermediaries whether to have one SPOC for all requests, or one point of contact for requests from authorities, another for exchanges with trusted flaggers, and a third for any other inquiry.

    While transparency to a certain degree is a welcome approach, it can also conflict with intellectual property rights or trade secrets. In addition, transparency is only useful when it is adequate and proportionate.
    If intermediary service providers are obliged to share detailed information on, e.g., the measures, tools and algorithms used to address illegal behaviour or content, this might prevent those measures from working efficiently, offering ways of circumvention to malicious users. In addition, while general rules on content moderation have a permanent character, some more granular parts might change in response to worldwide developments outside an intermediary’s immediate influence. As a consequence, an obligation for an exaggerated level of detail could turn the T&C into an unreliably fluid document. Therefore, intermediaries should be able to refer to a publicly available online source for details on content moderation in an easily accessible format.

    It is concerning that the legislator is putting private companies in the position of a judicial actor. The legal interpretation of content cannot, and most definitely should not, lie with an intermediary service provider. Especially considering that the Internet is a global market and that even smaller intermediary service providers often operate internationally, a legal interpretation of the respective national law in every possible market constitutes an unsolvable challenge.

    The scope of Article 15 DSA explicitly addresses (even pure) hosting providers and excludes neither micro nor small or medium-sized enterprises.
    The article is too vague and disproportionate, leading to an exaggerated burden, not only but especially on MSMEs. Detailed publicly available documentation would further enable users in bad faith to find loopholes and potentially circumvent the removal or blocking of (potentially) illegal content.

    A six-month period is unrealistic and a disproportionate burden. The Internet is a short-lived medium. If a recipient of a service does not challenge the decision of an online platform within days, it has to be expected that they either do not feel wrongly treated, that the relevance of the action is minor, or that the account is not in active use. For an online platform, on the other hand, such a long period would increase the number of cases brought forward despite a lack of gravity, require data on decisions to be retained for longer, and mean that the original reasoning could in the meantime face changed realities. The period should be decreased to ten weeks at most.

    It seems necessary to define exceptional circumstances in which intermediaries are not obliged to offer redress options, including out-of-court dispute settlement, e.g., when the content in question is spam, child sexual abuse material or terrorist content. This further applies to removals based on national authorities’ removal orders (under Article 8 DSA), including where these orders may be confidential and appear as the online platform’s own decision.

    Platforms are always obliged to bear their costs for an out-of-court dispute settlement, while recipients of the service at most risk having to bear their own expenses, or nothing at all. In addition, the arbitration process does not include any protective measures against abuse; as a consequence, actors in bad faith could flood an intermediary with out-of-court dispute settlement procedures, generating costs and slowing down the process for other, legitimate recipients of the service. This approach seems unfairly imbalanced.

    Out-of-court dispute settlement processes should foster transparency and clarity. To that effect, there should be one certified out-of-court dispute settlement body per Member State, and such bodies should work towards harmonised approaches and decisions to avoid fragmentation within the Single Market. One potentially helpful measure in this regard could be for settlement bodies to take on the responsibility of publishing transparency reports on out-of-court dispute settlements (as foreseen in Article 23 DSA). Finally, out-of-court dispute bodies should be distinct from regulatory oversight bodies, and the text should clarify this.

    […that are manifestly unfounded], insufficiently precise or inadequately substantiated. (cf. Article 19.5)

    Repeated abusive submission of blocking requests should not only lead to temporary suspension but should also trigger liability and compensation obligations for the resulting damage.

    The system for trusted flaggers proposed in the DSA compels online platforms to work with any awarded trusted flagger and to arrange certain procedures with them. It would not matter whether a trusted flagger interacts with the platform multiple times a day or only once a year. In combination with the SPOC in Article 10, it would even be possible to use a generally published contact, probably via email, to flag content. This can by no means result in an efficient procedure.

    The reference to right-holders is a worrying example of how complicated the system could become very quickly. While right-holders are without doubt affected by illegal content, they are also numerous. To keep the system effective it is, however, necessary to limit the number of trusted flaggers nationally awarded by a DSC.

    In general, it is questionable that the Commission proposes to replace an already functioning system with centralisation at national level. Online platforms and hotlines (like the members of INHOPE) have developed a rather efficient system to take down illegal content.

    The idea of a KYBC approach in the DSA seems reasonable. However, the obligation to gather information on the economic operator, on top of a wide range of details on the trader, appears excessive. For an average online marketplace, this duty would likely mean having to check and maintain dozens to hundreds of references per trader. Paragraph 1(d) should therefore be deleted.

    It should be clarified that these provisions, which are aimed at online marketplaces, should not apply to online platforms that do not allow the consumer to conclude a distance contract but act as intermediaries between the user and the third-party trader.

    The reference in Article 28.1(a) DSA to the whole of Chapter III seems excessive. A carve-out of at least Section 5 would be appropriate, not least because those provisions are expected to fall under separate monitoring.

    While Paragraph 6 considers the risk of vulnerabilities for security or the protection of confidential information, it only does so for ‘significant’ ones and puts VLOPs in the position of having to ‘request’ that the DSC adapt its request. The final decision, however, again lies with the DSC or the Commission. This seems insufficient, especially given that granting access to databases upon ‘reasoned request’ puts trade secrets at risk, and that sharing data externally via databases or APIs introduces unnecessary security risks by requiring access that can be misused by third parties.

    By means of delegated acts, the Commission shall be able to lay down the purposes for which the data may be used. By doing so, the Commission would be able to change the purpose of Article 31 DSA without proper checks and balances.

    This clause is important for helping identify the true identity of online advertisers, including political advertisers. It is essential that advertisers’ identity is verified properly by platforms, and that this information can be checked by external oversight actors such as civil society and journalists.

    A register of advertisers should be made public, including a link to the trade register or electoral commission the advertiser is registered with.

    The risks detailed in Para. 1 need to be more clearly phrased and defined. Broad terms like “civic discourse” and “intentional manipulation” of electoral processes risk being so vague that they will either not be applied, or be implemented in a way that may harm freedom of expression and of the press.

    Risk assessments and decisions on mitigation measures should be conducted through an inclusive, transparent and participatory process with participation from Digital Services Coordinators, data protection authorities and media regulators, as well as civil society and rights groups, media associations and representatives of minority and disadvantaged groups. Decisions such as demoting particular kinds of content cannot continue to be taken without any oversight from free speech advocates. Such decisions need to undergo rigorous checks for their potential impact on people’s right to opinion formation, freedom of speech, freedom of the press and non-discrimination.

    Risk assessment reports and mitigation measures need to be made publicly available, at minimum in an edited form.

    The DSA should incorporate obligations to ensure risk assessments are not merely a tick-the-box exercise by the platforms, but take these external views into account.

    the natural or legal person on whose behalf the advertisement is displayed, including the contact information and, for legal persons, a link to the official registration number of the entity;

    This line seems to refer to the audience reached, rather than the audience intended by the advertiser. It thus seems to refer to the platform-internal parameters that decide whom to target, rather than the targeting criteria advertisers choose. The provision should clarify this and require both the targeting criteria and the audience reached to be publicly displayed.

    This should include the right to privacy and freedom of opinion formation, both essential to democracy.

    The term “illegal” already refers to content that is “not in compliance with Union or Member State law”, so this is already covered.

    Certain mitigation measures may amount to limitations on freedom of speech, yet these decisions are taken without any public scrutiny. As with the assessment of systemic risks, measures should be put in place to ensure the mitigation of risks is an inclusive, participatory process involving relevant stakeholders beyond the private sector.

    While audits are an excellent accountability mechanism, such audits should be conducted by a public oversight agency such as the European Board for Digital Services, rather than by private sector auditors. This is necessary to ensure consistent standards and methodologies for auditing over time and across platforms. Moreover, it would mitigate the risk of revolving doors between Very Large Online Platforms and the very few auditing firms capable of auditing them.

    Article 29 should include the obligation for platforms to proactively inform users of their options regarding recommender systems.

    In order to ensure meaningful transparency, the advertising repositories of Very Large Online Platforms should also include:
    – User access to their dynamic advertising profiles;
    – Audience reached;
    – Engagements with the ad;
    – GDPR basis for data processing and data source.

    In addition, the Commission needs to set clear standards for advertising transparency that are flexible and can be changed over time, with which VLOPs must comply under threat of sanctions for non-compliance.

    This refers to content moderation by the service provider. The notion that communities also do their own content moderation is completely forgotten. I am unsure whether it would be tactically wise to build this distinction in here, as it could backfire. But if we want a real horizontal framework for the EU, it is something worth discussing.

    Consumer protection, product safety and the protection of minors must be defined as explicit legal objectives in Article 1.2, building on recital 34.

    BEUC welcomes that the EU Charter of Fundamental Rights is mentioned not only in Art. 1.2, but also in other provisions. However, despite being enshrined as a fundamental principle in Article 38 of the Charter, consumer protection is not mentioned when the proposal refers to certain articles of the Charter. We would like this to change. See in particular Articles 26.1(b), 37.4(e) and 41.6, and recitals 3, 41 and 57.

    While the proposed criteria are not cumulative (which is good), BEUC considers that the criterion of having “a significant number of” EU users is included within the other criterion. Therefore, it can be deleted.

    BEUC recommends deleting ‘a significant number of users in one or more Member States’ from Article 2(d) and recital 8, as this criterion is included within the criterion of targeting activities towards one or several EU Member States. The second criterion (“the targeting of activities towards one or more Member States”) is in line with private international law and CJEU case law.

    While we understand this concept includes both traders and consumers, the proposal sometimes gives the impression it is only referring to business users. In the same vein, this definition differs from Article 2(d) of the e-Commerce Directive. The distinction between traders and consumers on the one hand and recipients of the service on the other should always be clear.

    While the definition is acceptable to us (as the DSA is not meant to define when content is legal or illegal; other EU and Member State laws do so), the use of this term should be more coherent and consistent throughout the text. Illegal content and ‘activities’ are sometimes used interchangeably in the articles. Similarly, information should not be treated the same as ‘goods’ or ‘services’.

    The exclusion of messaging services like WhatsApp in this recital makes sense, for example to ensure encryption is not weakened in order to moderate content. However, BEUC notes that messaging apps may become sales channels in the future, in which case some DSA rules (like the traceability of traders under Article 22) should apply to them. Therefore, they should not be fully exempted from the scope of application.

    The definitions of ‘online platform’ and ‘dissemination to the public’ (and related recitals) should be assessed carefully to ensure that all relevant companies and their services would be subject to relevant DSA obligations.

    The definition of ‘advertisement’ should include both direct and indirect ways to promote, market or rank information, products or services. In the same vein, BEUC witnesses both direct and indirect forms of remuneration; both should be included. This is important because, for example, some online influencers do hidden marketing of products without revealing that they are paid for it.

    The definition of ‘recommender system’ should not be limited to “suggestions” but should also cover ranking and prioritisation techniques. This change would bring the definition in line with recital 62, as recommended by the European Data Protection Supervisor (EDPS).

    The definition is good, but it is important to clarify in a recital that terms and conditions should be in line with applicable EU and Member State laws. While it may seem obvious, this is a major issue. For example, recently, following an action brought by the consumer organisation UFC-Que Choisir, the French Tribunal de Grande Instance ordered Twitter to delete more than 250 abusive clauses, condemned Google for using 209 unfair or illegal contract clauses and found Facebook had 430 abusive or illicit clauses.

    It is problematic that recital 17 would prevent the establishment of a positive liability for marketplaces. It is also problematic that it establishes that “the exemptions from liability established in this Regulation should apply in respect of any type of liability as regards any type of illegal content, irrespective of the precise subject matter or nature of those laws.” Other laws such as the GDPR or Directive 2019/771 (the ‘Sales Directive’) establish and/or allow Member States to establish positive liability for certain types of platforms. Recital 17 must be amended to ensure it does not preclude the establishment of a positive liability on online marketplaces and does not apply to questions relating to information society services covered by other EU or Member State laws.

    There should be a broad understanding of what online marketplaces are. BEUC also proposes that Article 5.3 be amended to ensure online marketplaces and traders can be held jointly and severally liable:
    • For non-compliance with their due diligence obligations, for example if a marketplace fails to demonstrate it verifies traders in accordance with Article 22 of the DSA.
    • For damages, when failing to act upon obtaining credible evidence of illegal activities, without this amounting to a general duty to monitor the activity of platform users.
    • For damages, contract performance and guarantees:
    1- for failure to inform consumers about the supplier of the goods or services, in line with Article 4.5 of the Omnibus Directive introducing the new Art. 6a.1(b) of the Consumer Rights Directive and the CJEU Wathelet case C-149/15;
    2- for providing misleading information, guarantees, or statements;
    3- where the platform has a predominant influence over suppliers or the transaction. Such predominant influence or control could be inferred from non-exhaustive and non-cumulative criteria assessed on a case-by-case basis by courts. Following research by the European Law Institute, we would like to suggest the following indicative criteria:
    a) “The supplier-customer contract is concluded exclusively through facilities provided on the platform;
    b) The platform operator withholds the identity of the supplier or contact details until after the conclusion of the supplier-customer contract;
    c) The platform operator exclusively uses payment systems which enable the platform operator to withhold payments made by the customer to the supplier;
    d) The terms of the supplier-customer contract are essentially determined by the platform operator;
    e) The price to be paid by the customer is set by the platform operator;
    f) The marketing is focused on the platform operator and not on suppliers; or
    g) The platform operator promises to monitor the conduct of suppliers and to enforce compliance with its standards beyond what is required by law.”

    To be proportionate, marketplaces should enjoy a right to redress against the party at fault. This is precisely what was done under Article 82 of the General Data Protection Regulation (GDPR). For this, both Article 5.3 and the related recitals must be amended. Recital 17 must be deleted or amended to ensure it does not prevent establishing a positive liability regime for online marketplaces and does not hamper existing legislation.

    General comment: as long as a hosting service provider is not exempted from liability in line with Article 5.1, the DSA should establish that consumers can exercise against the intermediary service provider all the rights and remedies that would be available against the trader, including compensation for damages, repair, replacement, price reduction, contract termination or reimbursement of the price paid. In addition, specific remedies for consumers should be foreseen in case the intermediary service provider is in breach of its own obligations listed in this Regulation.

    BEUC is very sceptical about introducing a “Good Samaritan”-type clause to add more protections for intermediary service providers that adopt “voluntary” actions. BEUC recommends deleting Article 6 and recital 25, or at least narrowing them down as much as possible, to ensure any voluntary actions taken by intermediary service providers are designed to be effective.

    BEUC recommends deleting Article 6 and recital 25, or at least narrowing them down as much as possible, to ensure any voluntary actions taken by intermediary service providers are designed to be effective. If policymakers manage to ensure the obligations in the DSA are clear and strong, companies will not need to go beyond the law and take (unenforceable) voluntary actions. Democratic institutions should decide what platforms should do. Consumers need further protection, not platforms. In the event that Art. 6 is retained, the measures to be taken must be specified, e.g. via the platforms’ terms and conditions. Overall, Art. 6 is very vague, which leads to strong legal uncertainty.

    Articles 7 and 22 and recitals 28 and 50 should not prevent online marketplaces from being obliged to conduct periodic checks on trader accounts and on the products and services whose offering they facilitate.

    BEUC questions the exclusion of “small” companies from all platform-related obligations under this section. At the very least, Article 16 should only exclude micro enterprises, with the exception of Articles 17 (internal complaint mechanism), 18 (out-of-court dispute settlement), 22 (traceability of traders), 24 (online advertising) and 29 (recommender systems*).

    *Article 29 should not be limited to very large online platforms.

    – The threshold should be lower and “active” should not be a requirement. It is not clear what “active” would mean, and it may not be an appropriate qualitative criterion to indicate whether a platform poses systemic or significant risks. Any loopholes or uncertainties in how this is calculated could be used by companies to try to avoid falling under the scope of the DSA obligations.
    – It should be amended to consider local thresholds. For example, if a company is a VLOP in one Member State but not in others, that Member State should be able to establish additional obligations.
    – It should include emerging companies, i.e. companies that are not yet very large platforms but are growing exponentially and therefore pose increased risks for consumers.
    – Section 4 of Chapter III should apply at the same time as the rest of the Regulation, without granting additional delays. After all, we are talking about the biggest companies with the biggest resources. It is unacceptable that for these rules to apply to major players we would first need to wait for Digital Services Coordinators to be appointed (“two months from the date of entry into force of this Regulation”, Art. 38.3), wait for the European Board for Digital Services to be constituted, get a methodology agreed via a delegated act – with an opinion of the Board – (Art. 25.3), then wait for Digital Services Coordinators to build a list, wait for platforms to be notified by them, and then wait four more months after the publication of such a list in the Official Journal of the European Union (Art. 25.4). This process is not proportionate and would create unacceptable delays.

    Having an internal complaint-handling mechanism (Art. 17) should apply to all hosting service providers, small ones included. To solve this problem, BEUC suggests including an internal complaint-handling mechanism within the notice and action mechanism.

    The obligation to adhere to an alternative dispute settlement mechanism (Art. 18) should apply to all platforms, small ones included. This can actually be beneficial for small businesses, helping them avoid high litigation costs. Article 17 of the e-Commerce Directive already provided for the possibility of out-of-court dispute settlement, without distinguishing entities depending on their size.

    Article 29 should apply to all platforms, not just very large ones. This article should be moved to (or replicated in an improved manner in) Section 3 of Chapter III.

    BEUC recommends adding a new article in Section 2 of Chapter III, applicable to all hosting service providers. One paragraph should make sure that hosting service providers do not design their online interfaces, or parts thereof, in a way that engages in so-called ‘dark patterns’. A separate paragraph should bring Article 22.7 here, in a modified form.
    Art. 22.7 is very welcome but should go beyond pre-contractual and information requirements. Currently, Article 22.7 is only meant for platforms to support traders in complying with their obligations; it does not cover platforms’ compliance with their own obligations.

    Platforms should design their websites and applications in a way that consumers are not pushed to take decisions which benefit the platform or trader in question and are not necessarily in the interest of consumers. The design of the interface should also ensure traders can easily comply with consumer and product safety laws as a whole, not just parts of them (e.g., right of withdrawal, terms and conditions). Likewise, traders that do not fulfil their obligations under consumer and product safety legislation should not be admitted to the platform, or should be suspended.

    This figure would act similarly to that of a Data Protection Officer (DPO) under the GDPR, which has proven to bring benefits to companies and to consumer trust. It is important that companies provide officers with the necessary powers to conduct their tasks, and that there are protections so they are not dismissed for trying to ensure compliance with the DSA and other applicable laws. Digital Services Coordinators and/or the Commission should be obliged to communicate the names of the compliance officers to the authorities within the Board.

    BEUC suggests adding more clarity as to what “necessary powers and resources” will mean in practice. It is important that legal representatives have the power to ensure effective and swift compliance with the DSA on behalf of the company represented.

    Policymakers should ensure the rules are enforceable both against non-EU players that target services at EU consumers and against non-EU traders that use online intermediaries.

    BEUC recommends adding in Article 12 that terms and conditions must disclose all remedies available, including applicable alternative dispute resolution mechanisms independent from the company.

    In addition, this article should require the provision of a very short, clear and user-friendly summary of the key T&C for consumers, taking inspiration from Article 102 of the European Electronic Communications Code.
    This summary should include the remedies and redress mechanisms available.

    Finally, in line with Article 6.1(d) of the Consumer Rights Directive, it is important that the article also ensures online marketplaces clearly inform consumers that they enter into contracts with both the marketplace provider and a trader.

    BEUC recommends improving Articles 13, 23 and 33 to ensure reports are written in objective terms (not as marketing tools) and follow a consistent methodology. Competent authorities, researchers and public interest NGOs (consumer organisations included) should be able to request raw data, with personal data redacted, and to verify the statistics conveyed. It is important that the reports include actions taken for different types of illegal content, not just a few. For instance, we notice that among currently available reports a lot of focus is put on intellectual property notice and takedowns but not so much, if at all, on product safety or consumer law violations. Similarly, transparency reports should include statistics on the audience/users that use the company’s services.

    Article 14 should distinguish between notices from companies and notices from individuals. In addition, the action taken by platforms should take into account different types of illegal activities.

    Paragraph 3 could be further clarified so that intermediary service providers are obliged to adopt reasonable decisions on removals or on disabling access to content. This would counter the incentive to automate removals or blocking upon receiving a mere notification.

    However, from a consumer perspective, it is important that a notice by a consumer or a consumer organisation concerning, for example, the sale of an illegal product ensures prompt removal, and that the notice and action mechanism in the DSA cannot be misused as an excuse for platforms not to act or to claim the notice did not lead to awareness or knowledge of an illegal activity. To preserve freedom of expression and address complex balancing acts, it may be relevant to differentiate between different types of illegal activities and between notices depending on the stakeholder that flags them. A complementary solution is to have explicit, greater safeguards against over-removal (requiring a diligent check and reinforcing or connecting this with the safeguards the Commission has proposed for platforms, e.g., safeguards against abusive notifiers and the right to reinstate legal content during the appeal process). It is important to assess this provision not only from the perspective of freedom of expression but also from a consumer protection and product safety perspective. Contrary to the evidence of overblocking found in the field of copyright, for example, consumer organisations have witnessed the big problem of ‘under-removal’ for consumer or product safety law violations, leading to the unhindered proliferation of illegal products on platforms.

    We need to ensure Article 14 also applies when platforms treat notices as terms of service violations. Current practice shows that platforms operate notice and action mechanisms based on their terms and conditions rather than the law, and their scope tends to be rather limited. There should always be clear options to report, e.g., consumer and product safety law violations as well.

    Regardless of how the user evaluates the content (illegal content or a violation of the platform’s T&Cs, even after a further request from the platform), the process must be carried out following the notice and action mechanism of Article 14 (unless another law provides for a content-specific mechanism).

    It is also important that the mechanism to provide notices is not hidden and is well integrated in the way companies (notably platforms) present content. For example, a consumer should be able to report a potentially illegal activity while shopping, while scrolling a feed, when seeing an ad on a search engine and after a purchase.

    Providing a URL will not always be needed to determine which content a notifier is referring to. For example, if you report a tweet, Twitter knows the location of that tweet. On other platforms, like Facebook, you sometimes cannot copy a link, so this requirement may be difficult to fulfil. However, a URL may be needed for notices to other types of internet service providers.

    Consumer groups have noticed that some platforms provide quite restrictive choices when reporting content (e.g., multiple-choice checkboxes), which may limit what consumers are actually able to complain about. The DSA must ensure reporting mechanisms are comprehensive.

    Ultimately, the notice mechanism should be user-friendly, comprehensive and not too difficult for consumers to use or find.

    Hosting service providers should also be required to provide a comprehensive and detailed answer as to why they decide not to remove or disable access to illegal content referred to by a notice.

    – Overall, there are insufficient requirements for ensuring the quality and independence of platforms’ internal complaint-handling mechanisms.
    – Intermediary service providers need to be obliged to disclose the rules of procedure of their internal complaint-handling mechanism in their T&C, and these rules should also be presented clearly to consumers when they wish to complain.
    – The rules of procedure must provide for an established deadline for dealing with the complaint internally. This would let consumers know that after a certain number of days without a response, they can go to the ADR body. With a deadline, platforms will also be more likely to create well-structured and well-resourced internal procedures.
    – The internal complaint-handling mechanism should not be seen by consumers as the only avenue to obtain remedies. When proposing the internal complaint mechanism, online platforms should also be required to inform consumers that, alternatively, they can refer their claim to the certified ADR body.

    This recital specifies that the DNS industry is to be considered an online intermediary, yet none of the definitions of the three categories of intermediaries (mere conduit, hosting and caching) matches what the DNS industry does. Either the definitions should be amended, or (better) a fourth category of intermediaries for “network directory services” should be added.

    This exception is too strict, as it does not cover two other cases in which intermediaries (especially ISPs and DNS providers) could, and sometimes must as a result of other regulation, carry out activities to block content:
    1. disabling access to non-illegal content at the request of the user, e.g. parental controls on Internet access accounts used by children;
    2. disabling access to non-illegal harmful content such as network security threats (phishing websites, botnets).

    To do this, the “illegal content” language should be expanded by also adding “other content whose removal has been requested by the recipient of the service” and “other content that has been defined as harmful in the specific intermediary service by industry codes of conduct and recognised technical standards”.

    – Very important: this article should apply without prejudice to the Consumer ADR Directive, i.e. Directive 2013/11/EU (as recital 45 says).
    – BEUC recommends having one (or only a few) ADR providers certified at national level.
    – ADR entities must comply with strong quality requirements ensuring their independence from platforms, their autonomy and their impartiality. Inspiration should be drawn from the Consumer ADR Directive (Articles 6 et seq.). Compliance with such quality criteria should be assessed by authorities on an ongoing basis.
    – ADR entities under the DSA should have an obligation to draw up annual reports highlighting, inter alia, the number of complaints received, any systematic or recurrent problems, and the average time taken to resolve a dispute (this would mirror Article 7.2 of the Consumer ADR Directive). When doing so, they should base their analysis on, but not limit it to, the information submitted by platforms under Article 23 of the DSA, because ADR bodies need to provide their insights in an independent manner. This would be very useful both to address any serious non-compliance with the DSA and, where gaps are identified, for further improvements of the DSA and its enforcement.
    – There should be a deadline for ADR providers to process complaints. Under Article 8 of the Consumer ADR Directive, it is no more than 90 days.
    – BEUC also recommends the establishment of a network of ADR entities that exchange information.

    – BEUC particularly welcomes that platforms will not be able to designate ‘trusted flaggers’ themselves; authorities will (Art. 19.3), and the Commission will ensure the list is public and up to date (Art. 19.4). BEUC also welcomes that the proposal has safeguards against abuse (Art. 19.5-7). However, BEUC considers improvements are needed.
    – Article 19.2 should clarify the criteria for being eligible as a ‘trusted flagger’. In addition, the acquisition of such status should not entail an obligation on trusted flaggers to constantly monitor platforms or all types of content. Platforms should not simply outsource their work to trusted flaggers; platforms should also take responsibility. For example, consumer groups have repeatedly flagged unsafe products to marketplaces, but this cannot become the modus operandi to keep consumers safe.
    – There should be safeguards in place to ensure that notices by individual consumers or civil society organisations that are not (yet) recognised as trusted flaggers, or that do not have the capacity to engage in such a role, are not disregarded. In fact, not all NGOs, consumer organisations included, have the resources or interest to engage in such a role, but could nevertheless report illegal activities online to platforms.

    BEUC recommends that platforms devote best efforts to ensure suspended traders cannot re-join until the suspension is lifted. In some cases, the permanent exclusion of traders would be justified.

    There is a general issue of jurisdiction: it is not clear whether the “relevant national judicial or administrative authorities” are those of the country where the provider is based or those of the country where the user is located. The former interpretation clearly disadvantages European service providers with respect to non-European ones, as the latter can elect their EU representation in countries which do not block content and then unfairly compete with providers in other EU countries that are required to block content. For example, Italy blocks a long list of illegal content websites via DNS filters, but people can simply switch to Google’s DNS service to bypass any filter and access illegal content at will.

    This is a perverse situation that should be addressed by requiring providers with a significant market presence in a Member State to apply the rules of that Member State to users coming from there, even if the provider is based in a different Member State.

    A duty to promptly inform law enforcement or judicial authorities should apply not only when the life or safety of individuals is threatened under criminal law, but also when online platforms become aware of other illegal activities, such as fraudulent and scam ads or the sale of illegal products online by traders.

    Interoperability should be a requirement for dominant players, not a voluntary choice.

    Art. 22.1(d) is very important for the traceability of products and the accountability of traders. The online platform must make sure that, prior to the sale, it obtains the information about the economic operator in line with Article 4 of the Market Surveillance Regulation. In essence, the platform must know who the producer is (or the importer or the authorised representative, depending on who is in charge) and have the relevant contact details. This article in the DSA is also logical because other parts of the Market Surveillance Regulation require platforms to meaningfully cooperate with enforcement authorities, and handing over such contact details is the most basic thing we can expect from platforms.
    Letter (d) is also essential when we are facing non-EU traders, as marketplaces should check that the authorised representative of the trader exists. Otherwise, who is accountable vis-à-vis the consumer? How do we enforce the law against non-EU traders that use platforms to circumvent the law?

    Article 22.1(d) must stay. It is very important for the traceability of products and the accountability of traders. The online platform must make sure that, prior to the sale, it obtains the information about who the economic operator is, in line with Article 4 of the Market Surveillance Regulation. In essence, the platform must know who the producer is (or the importer or the authorised representative, depending on who is in charge) and have the relevant contact details. This is also logical because other parts of the Market Surveillance Regulation require platforms to meaningfully cooperate with enforcement authorities, and handing over such contact details is the most basic thing we can expect from them. Finally, this provision must stay to ensure the marketplace verifies that third-country traders have a European branch or an authorised representative in Europe. This is essential to ensure compliance with product safety law.

    They should not only devote best efforts to verify that the information is reliable, but also that it is complete and up to date.

    1. The exclusion from the general liability exemption for online marketplaces should be extended and specified. Traders are acting under the authority or control of an online marketplace if, for example:
    – the name of the online marketplace is in the foreground of the offer presentation,
    – the online marketplace has control over the payment system or sets the price,
    – the online marketplace gives the impression that it controls or monitors the behaviour of the sellers, or
    – the online marketplace essentially dictates the terms of the contract.
    These criteria should be included in the DSA.
    2. From the vzbv’s point of view, it is not understandable why the exclusion from the general liability exemption is limited to liability under consumer protection laws. This leads to great legal uncertainty. In recent years, the question of which requirements fall under consumer protection regulations and which do not (for example, data protection or product safety regulations) has repeatedly been the subject of litigation.
    3. In addition, the restriction to “average and reasonably well-informed consumers” must be deleted. All consumers should be protected from being misled. Accordingly, the exemption should not apply only when the “average and well-informed” consumer is concerned.
    4. Finally, the scope of Art. 5(3) must be extended so that the liability exemption also does not apply if the online marketplace violates the due diligence obligations listed in Art. 22. This must also be accompanied by an amendment of recital 17.

    1. In order to prevent a “free ticket” for platforms and to increase legal certainty, it needs to be clarified what is to be understood by “voluntary own-initiative investigations”.
    2. In addition, it should be clarified in Art. 6 that platforms do not benefit from the liability privilege if they become aware of illegal content through their own initiatives and do not react.
    3. Finally, reducing liability risks for platforms, as is done in Art. 6, must be accompanied by stronger preventive safeguards for users through measures such as pre-flagging, delayed takedown and stay-up obligations.

    Based on previous experience, the vzbv is sceptical about the concept of a legal representative. Similar concepts are known from other legislation, such as the regulation on cosmetic products or the General Data Protection Regulation (GDPR). Experience has shown that authorised representatives often have no real means of action against the companies themselves, also because they have no real assets that can be retained in the case of infringements. After the adoption of the GDPR, a new business model has emerged: among others, law firms take over the representation of non-EU companies in accordance with Art. 27 GDPR. In such constellations, the two parties (platform and legal representative) have a contractual relationship, but no liability relationship. Effective enforcement of the rules is made more difficult by such a system. In an emergency, the legal representative may open the mail from the court and forward it; it may nevertheless be difficult to take action against the platform. Direct responsibility of the platforms is therefore always preferable.

    1. Art. 14 reveals the weakness of an insufficient differentiation between the business models of platforms in the DSA.
    With regard to interaction platforms, the vzbv expressly welcomes the fact that the European Commission has not introduced rigid time limits for the deletion of content. In doing so, it is taking a different approach from Germany’s Network Enforcement Act (NetzDG). It is often difficult to determine the extent to which a post is illegal or covered by freedom of expression. In this respect, rigid time limits are not necessarily beneficial.
    However, the situation is different with regard to online marketplaces. It should be made clear that products listed, for example, in the RAPEX system or offered without CE marking must be removed within a maximum of 48 hours. The voluntary Product Safety Pledge shows that online marketplaces actually have the ability to do so.
    2. The insufficient differentiation between business models is also problematic in conjunction with the liability privilege in Art. 5(1): according to Art. 14(3), a notice identifying content as potentially illegal in a sufficiently precise and adequately substantiated manner (Art. 14(2)) is de facto equivalent to giving rise to actual knowledge. The likely consequence is that, just by receiving a formally correct notice, the risk of liability for the platform is so high that it removes the content without any further checking of its own. Thus, it is no longer sufficiently ensured that the platform provider first carries out its own assessment before the content is (provisionally) removed. This possible automatism should be countered with a clarification in the legal text.
    3. Finally, Art. 14(1) requires clarification. Regardless of how the notifier or the platform legally assesses the content, the notification should be processed via the notice and action mechanism of Art. 14. This ensures that the procedural safeguards also apply if, for example, the notifier reports a violation of the platform’s own rules (TOS, house rules, community standards). Otherwise, there is a risk that the platform will direct users to its own complaints mechanisms instead of a predefined, regulated and transparent procedure for reporting potentially illegal content, deviating from the prescribed procedure to the detriment of users.

    Recital 33 can be understood as a remarkable exemption from the country-of-origin principle. To ensure that courts and authorities make use of this possibility for cross-border actions, recital 33 needs to be codified in an article. This would create legal certainty with regard to the contradicting rules of the e-Commerce Directive, ensure that recital 33 is not overruled, and see its underlying intention put into practice by the national courts.

    Recommendation: Oblige online platforms to maintain a point of contact in every Member State that is accessible for users and not only for authorities.

    The point of contact should guarantee legally secure delivery of requests and be accessible in one of the country’s official languages. This can be as simple as mandating a law firm. Otherwise, users need to search the imprint for a valid postal address and may eventually have to struggle with the international delivery of legal documents, which is not user-friendly and discourages them from questioning content decisions.

    Recommendation: Codify an obligation to delete illegal content in an article.

    The proposal refrains from establishing an obligation to remove or block illegal content. This leads to the conclusion that the removal of content will not be enforceable by authorities and cannot be the subject of a complaint to the Digital Services Coordinator, even in cases of systematic contravention. Thus, liability remains largely theoretical: it cannot be enforced as an infringement of this proposal and, on the other hand, will most likely not be enforced by users either, due to a lack of access to justice and missing redress mechanisms if their notification is rejected by the platform.

    We recommend lowering the requirements for the presumption of proof for liability under Art. 14(3).

    We consider this standard of actual knowledge to be unreasonably strict, as it demands excessive and unnecessary information from users and gives advantages to platforms that already profit from extensive privileges in Articles 3 to 7. The obligation to provide the user’s name, e-mail address and the URL of the content should be removed. Those affected by digital violence are highly sensitive about revealing their data.
    Victims who have experienced digital violence before will not provide any personal data out of fear that their name or e-mail address will be revealed to the counterparty, especially when the content they want to report does not insult or defame them personally but is inciting or racist; in such cases their data is not necessary at all. Providing the URL can even be impossible if the platform is used on a mobile device, which is increasingly common.

    The proposal does not provide a deadline for the assessment of notifications. From our experience, “in a timely manner” is an insufficient benchmark, as it lacks a specific definition and will therefore be stretched to the limit by online platforms. Users should therefore be notified of the platform’s decision on their notification within seven days. Experience with the Network Enforcement Act has shown that platforms always stretch the interpretation of such terms to the maximum, which is why clear rules for assessment are necessary, also to give legal certainty to users who report illegal content and receive no reply from platforms, so that they can assess their prospects for legal steps.

    According to Art. 15, no statement of reasons is required from the platform if it decides not to act following a notification of illegal content. This deprives users of the opportunity to soundly assess the prospects of judicial review or to seek low-threshold and inexpensive reassessment.Reference

    Reply
  • 0
    0

    Recommendation: Oblige platforms to create a notification procedure that is clearly visible, low-threshold and located close to the content in question

    The current wording of the proposal is similar to that of the Network Enforcement Act (NetzDG). We know from experience that the interpretation of such requirements will be stretched to the limit by online platforms. We therefore cannot expect the notification procedure to be very user-friendly and see an urgent need for clarification.Reference

    Reply
  • 0
    0

    Make internal complaint handling and out-of-court settlement accessible to all users

    If the platform decides not to act following a notification of illegal content, the affected users will not have access to the internal complaint-handling system (Art. 17) or to out-of-court dispute settlement (Art. 18). Put into practice, this means that the situation of users whose notification is rejected by the platform does not change under the proposal. The users’ only option is to hire a lawyer and take legal action at their own financial risk.Reference

    Reply
  • 0
    0

    Recreate the out-of-court settlement mechanism and replace it with mandatory summary proceedings for content decisions in all Member States

    An “external body” is unsuitable to meet the desired standard of ensuring freedom of expression and the rule of law. It is incomprehensible why a non-defined external body should be entitled to make a binding decision. We recommend instead finding a regulation that improves access to justice for all users vis-à-vis online platforms in order to seek efficient judicial review. Binding content decisions need to be reserved for courts, and summary proceedings can significantly improve access to justice for all users vis-à-vis online platforms. There is no need to install a parallel infrastructure that is not even free of charge for users.Reference

    Reply
  • 0
    0

    Trusted flaggers can only be a supplement to a broader strategy. Relying on NGOs to proactively search for illegal content, instead of the online platforms themselves or law enforcement, implies shifting the burden onto civil society and taxpayers. It also means sparing the online platforms, which profit economically from the traffic on their platforms but are not obliged to economically cover the negative side effects as well.Reference

    Reply
  • 0
    0

    Do not suspend users from the notification procedure

    Users cannot be expected to assess the lawfulness of content, and it is not their task either. Negative consequences of the misuse of notifications result from wrongful and often automated complaint handling by the platform and the waiver of human oversight. Members of marginalized groups with a different understanding of discrimination, and people with little legal knowledge in particular, are in danger of being suspended from the notification procedure because too many of their reports are deemed unjustified.Reference

    Reply
  • 0
    0

    Rethink oversight in light of the country-of-origin principle. Keep in mind that most large social media platforms have their European establishment in Ireland and that Ireland will therefore be in charge of oversight. This enables platforms to cherry-pick their regulator.Reference

    Reply
  • 0
    0

    Smaller platforms registered outside of the EU currently create a safe haven for all sorts of illegal content. To ensure public safety, we need reliable options to restrain these services, or even block them in the EU, when they do not comply and do not appoint a legal representative.Reference

    Reply
  • 0
    0

    It should also be regulated what happens with the content during the platform’s review process and in the context of out-of-court dispute resolution or legal proceedings (take-down or stay-up during review, since at this point it is unclear whether the content is illegal):
    In case of serious and clear violations of the law, the content should be deleted immediately, if an automated check limited to the essentials (plausibility check) shows that the notice is valid. These particularly serious infringements could include, in particular, content glorifying violence, national socialist statements, cruelty to animals or the offering of products listed in RAPEX.
    In case of other violations of rights, a stay-up obligation arises, provided that the content was flagged as lawful and the plausibility check shows that the flagging is not obviously incorrect (i.e. not incorrect in a way that is easily recognizable for anyone at first glance): pre-flagging and delayed take-down. The platform operator shall be exempted from liability for the flagged content until the verification procedure has been completed.
    The risk of abuse inherent in both an indication of an infringement and a pre-flagging possibility could be countered by applying the solution found in Art. 20.Reference
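    As a purely illustrative aid, the proposed take-down/stay-up logic could be summarised in the following sketch; the category list, field names and helper structure are assumptions made for this illustration, not part of the proposal or of the comment above.

```python
# Hypothetical sketch of the take-down/stay-up flow described above.
# Category names, field names and thresholds are assumptions, not proposal text.

from dataclasses import dataclass

SERIOUS_CATEGORIES = {
    "glorification_of_violence",
    "national_socialist_content",
    "cruelty_to_animals",
    "rapex_listed_product",
}

@dataclass
class Notice:
    category: str                # reporter's classification of the alleged violation
    plausible: bool              # outcome of the automated minimal plausibility check
    pre_flagged_lawful: bool     # uploader flagged the content as lawful in advance
    flag_manifestly_wrong: bool  # the lawfulness flag is obviously incorrect at first glance

def interim_status(notice: Notice) -> str:
    """Return what happens to the content while the review is still pending."""
    if not notice.plausible:
        return "stay-up"      # notice not plausible: nothing to act on yet
    if notice.category in SERIOUS_CATEGORIES:
        return "take-down"    # serious and clear violation: delete immediately
    if notice.pre_flagged_lawful and not notice.flag_manifestly_wrong:
        return "stay-up"      # delayed take-down; platform exempt from liability meanwhile
    return "take-down"

if __name__ == "__main__":
    print(interim_status(Notice("rapex_listed_product", True, False, False)))  # take-down
    print(interim_status(Notice("defamation", True, True, False)))             # stay-up
```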

    Reply
  • 0
    0

    The vzbv welcomes the possibility that content providers shall be entitled to select out-of-court dispute settlement. However, private users should not have to bear the costs of this procedure. In addition, the anonymity of the procedure should be guaranteed. For example, it should be ensured that the conciliation body may only disclose the personal data of the complainant, or of the user for whom the content complained about was stored, with their consent. From the user’s point of view, ensuring low-threshold and, where appropriate, anonymous access to remedies and the absence of cost risks is crucial to actually mounting a defence.
    The merely theoretical guarantee of the best user rights cannot deliver if the user does not know how to exercise them or if the costs (time and money) are too high.Reference

    Reply
  • 0
    0

    It must be clarified what “collective interests” means. Individual companies must not be recognized as “trusted flaggers”.
    To ensure that consumer protection organizations can be recognized as trusted flaggers, they should be added to the list in recital 46.Reference

    Reply
  • 0
    0

    The vzbv welcomes the identity check required in Art. 22. However, it remains unclear whether the items of information listed are exemplary or all mandatory (see recital 50).
    From the vzbv’s point of view, it is absolutely necessary for commercial users to prove their identity by means of copies of official documents such as IDs, certified bank statements, excerpts from the commercial register, etc.
    Furthermore, it should be made clearer which verification obligations are reasonable for marketplace operators. The vzbv agrees with the view that the check should not be too burdensome. However, a test call to check the phone number, for example, should be reasonable. In addition, the vzbv expressly disagrees that online marketplaces should be completely released from responsibility if they have verified the traders accordingly.Reference

    Reply
  • 0
    0

    In the vzbv’s view it is not enough that online marketplaces merely provide an online interface that enables traders to comply with certain information requirements. It should actually be mandatory for traders to provide this information; if they do not, online marketplaces must prohibit the publication of their offers. Furthermore, online marketplaces should carry out regular checks on whether the information is plausible.
    It should be mandatory for online marketplaces to reveal the identity of the trader to consumers for direct communication via electronic communication channels in the language of the marketplace.
    Online marketplaces should be required to inform new commercial users about consumer rights prior to admission and require them to comply.
    Finally, online marketplaces that neglect their duty of care should be covered by the exception to the liability privilege in Art. 5 (3).Reference

    Reply
  • 0
    0

    The vzbv welcomes these proposals. However, the information must be comprehensible, relevant and concrete. It is therefore regrettable that Art. 24 (c) in particular is characterised by vague legal terms. The vague formulations “meaningful information” and “main parameters” severely limit the provision.
    Particularly in the context of the increasingly pressing problem of “dark patterns”, the current proposal is inadequate.
    In particular, it is not sufficient to rely solely on encouraging platforms and other stakeholders to develop a code of conduct to ensure compliance (in accordance with Article 36).
    Furthermore, the vzbv would like to point out that transparency as such hardly offers any added value for consumers, but must always be embedded in an overall regulatory concept.
    However, such an overall concept within the framework of the DSA is hardly recognisable so far with regard to individualised online advertising. Personalised online advertising, as rightly noted in the recitals, poses significant risks for consumers and society.Reference

    Reply
  • 0
    0

    The catalogue of risks in Art. 26 should take into account risks of violations of consumer rights: manipulation, undue influence and misleading of consumers, as well as non-compliance with consumer rights, should be explicitly included as systemic risks in Art. 26.Reference

    Reply
  • 0
    0

    The measures listed are formulated as examples. In the view of the vzbv, this is not enough. Online marketplaces should be obliged to take specific measures to reduce risks. The Product Safety Pledge (already mentioned above) already contains sensible measures: online marketplaces should set up processes in cooperation with the authorities that aim to proactively remove banned products. In addition, measures should be taken to prevent offers of illegal products that have already been removed from reappearing. It should also be added that VLOPs must take measures against traders who (repeatedly) offer illegal products.Reference
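    One way such a re-upload filter for already removed illegal products could work is sketched below; the identifier scheme (GTIN plus normalised title) and all data are assumptions for the illustration, not requirements of the Product Safety Pledge or the DSA.

```python
# Illustrative only: one way a marketplace could keep offers of already removed
# illegal products from reappearing. The identifier scheme and data are invented.

import hashlib

class RemovedProductRegistry:
    def __init__(self) -> None:
        self._fingerprints: set[str] = set()

    @staticmethod
    def _fingerprint(gtin: str, title: str) -> str:
        normalised = f"{gtin.strip()}|{title.strip().lower()}"
        return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

    def record_removal(self, gtin: str, title: str) -> None:
        """Called whenever an offer is taken down as illegal."""
        self._fingerprints.add(self._fingerprint(gtin, title))

    def is_known_illegal(self, gtin: str, title: str) -> bool:
        """Check a newly submitted offer against previously removed products."""
        return self._fingerprint(gtin, title) in self._fingerprints

registry = RemovedProductRegistry()
registry.record_removal("0000000000000", "Example banned product")
print(registry.is_known_illegal("0000000000000", "example banned product"))  # True
```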

    Reply
  • 0
    1

    VLOPs must be obliged to clearly display to consumers which main parameters influence the ranking order, also via the user interface, i.e. directly when the results are displayed.
    Furthermore, it should be regulated that parameters based on payments (especially commission payments) or on other business relationships or ownership structures between the platform and the trader always constitute main parameters within the meaning of Art. 29 and must be made transparent. In addition, Art. 29 should be amended to the effect that consumers must always have a ranking option that is not based on payments or on other business relationships or ownership structures between the platform and the seller.
    The Omnibus Directive (EU) 2019/2161 already provides for additional transparency obligations for online marketplaces, not limited to VLOPs. It should be made clear that large online marketplaces will still have to comply with the provisions of that directive in the future.Reference
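    A minimal sketch of the suggested non-commercial ranking option is given below; the parameter names and example values are invented for illustration and are not taken from Art. 29 or the Omnibus Directive.

```python
# Hedged illustration of the suggested non-commercial ranking option: next to the
# default ranking (which may weight commercial parameters and must then disclose
# them as "main parameters"), consumers can always pick a payment-free ranking.

from dataclasses import dataclass

@dataclass
class Offer:
    title: str
    relevance: float        # how well the offer matches the query (0..1)
    rating: float           # average user rating (0..5)
    commission_rate: float  # share of the sale paid to the platform (0..1)
    sponsored: bool         # paid placement

def default_ranking(offers):
    # commercial parameters influence the order and would have to be disclosed
    return sorted(offers, key=lambda o: (o.sponsored, o.commission_rate, o.relevance), reverse=True)

def payment_free_ranking(offers):
    # alternative the consumer can always select: no payment-based parameters
    return sorted(offers, key=lambda o: (o.relevance, o.rating), reverse=True)

offers = [Offer("A", 0.7, 4.5, 0.15, True), Offer("B", 0.9, 4.8, 0.05, False)]
print([o.title for o in default_ranking(offers)])       # ['A', 'B']
print([o.title for o in payment_free_ranking(offers)])  # ['B', 'A']
```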

    Reply
  • 0
    0

    Art. 51 strengthens the position of the European Commission vis-à-vis VLOPs. However, it is not understandable why there is no obligation for the European Commission to act in case of infringements. Consequently, the EC could be accused of inconsistency in its actions against VLOPs and of making purely political decisions. It should therefore be specified in the legislative process when the European Commission will take action under Art. 51 ff.Reference

    Reply
  • 0
    0

    Grant a single point of contact to civil society and representative user groups. This helps to formalize the connection between existing support structures (e.g. women’s advice centres, LGBTIQA+ support groups) for the groups most vulnerable to digital violence and in need of help, and a central point of contact.Reference

    Reply
  • 0
    0

    Oblige platforms to establish an independent community management function that mediates between the content moderation team and users who feel their content or account has been wrongfully blocked, and make it part of the support structure for out-of-court settlement. Community management should focus on mediation and help users as well as content moderation teams to resolve their cases. Independent community management can also play a part in upholding users’ rights.Reference

    Reply
  • 0
    0

    Informing users/citizens about their right to report content is part of media literacy and civic education. Easily accessible language is a step towards understanding the terms and conditions and one’s own rights within them. Evaluate to what extent users/citizens are aware of their rights when it comes to understanding terms and conditions, e.g. reporting of content and where to seek help in cases of discrimination, fraud, etc.Reference

    Reply
  • 0
    0

    Support research and independent evaluation of reports that create a deeper understanding of the way social norms and sanctions are distributed in online communities and enhance self-regulation. Understanding how alternative content moderation is conducted helps to prevent the upload of illegal content and prevents discrimination.Reference

    Reply
  • 0
    0

    Foster research and independent evaluation of reports that create a deeper understanding of the way social norms and sanctions are distributed in online communities and enhance self-regulation. Understanding how alternative content moderation is conducted helps to prevent the upload of illegal content and prevents discrimination.Reference

    Reply
  • 0
    0

    Be aware of the growth in user numbers of social media platforms and of their development. By 2025, growth of about 30 million internet users is expected in the EU, and markets might shift under the DSA. Smaller niche platforms might grow large and will then fall under the law; such platforms might suppress their growth in order not to face high penalties. The operational threshold needs to be revised regularly, as the DSA in practice will probably change numbers and markets in ways not considered today. Another aspect is the definition of a “proper user” (accounts, logins, unique logins, etc.).Reference
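    The sketch below illustrates why that definition matters: depending on whether accounts, login events or de-duplicated logged-in users are counted, the same (invented) platform falls on different sides of the proposal’s threshold of 45 million average monthly active recipients (Art. 25). All figures are made up for illustration.

```python
# Invented numbers illustrating how the "proper user" definition interacts with
# the proposal's VLOP threshold of 45 million average monthly active recipients.

VLOP_THRESHOLD = 45_000_000

monthly_figures = {
    "registered_accounts": 60_000_000,     # includes dormant and duplicate accounts
    "login_events": 120_000_000,           # every login counted, duplicates included
    "unique_logged_in_users": 41_000_000,  # de-duplicated people who logged in
}

for definition, count in monthly_figures.items():
    status = "VLOP" if count >= VLOP_THRESHOLD else "below threshold"
    print(f"{definition:>24}: {count:>12,} -> {status}")
```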

    Reply
  • 0
    0

    Make trusted flaggers transparent: publish who holds the status and how it can be obtained (publish the criteria). Trusted flagger status should be issued only for a limited period of time and should be regularly reviewed.Reference

    Reply
  • 0
    0

    Issue trusted flagger status only for a limited period of time and review it regularly. There should also be a frequent exchange with moderation teams and the Digital Services Coordinator.Reference

    Reply
  • 0
    0

    Oblige platforms to make reporting procedures easily available (3-click rule), in a user-centred design and in plain language. Prohibit dark patterns in reporting structures.Reference

    Reply
  • 23
    54

    It should be made transparent in detail what kind of illegal content is taken down, e.g. similarly to the GIFCT and the PhotoDNA hash database, via an independent agency for research and supervision (e.g. the Lumen database). This can help to safeguard against and control unlawful take-downs.Reference

    Reply
  • 7
    18

    A good example of this kind of approach concerning illegal content is the Lumen database.Reference
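    A very simplified sketch of the hash-database idea is shown below; real systems such as the GIFCT database or PhotoDNA rely on perceptual hashes that survive re-encoding, whereas a cryptographic hash is used here only to keep the example self-contained, and the log record merely loosely imitates a Lumen-style public entry.

```python
# Simplified sketch of a shared hash database with a public decision log.
# Hash choice and record format are simplifications, not a real system's API.

import datetime
import hashlib
import json

# entries would come from an independent agency, not from the platform itself
known_illegal_hashes = {
    hashlib.sha256(b"example banned file bytes").hexdigest(),
}

def check_upload(content: bytes) -> dict:
    digest = hashlib.sha256(content).hexdigest()
    matched = digest in known_illegal_hashes
    record = {
        "timestamp": datetime.datetime.utcnow().isoformat() + "Z",
        "content_hash": digest,
        "matched_known_illegal": matched,
        "action": "removed" if matched else "published",
    }
    # publishing such records externally is what makes unlawful take-downs auditable
    print(json.dumps(record))
    return record

check_upload(b"example banned file bytes")  # matched -> removed
check_upload(b"harmless holiday photo")     # no match -> published
```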

    Reply
  • 9
    18

    Create round tables and make the platforms exchange information on specific cases that cut across platforms (cross-platform abuses, specific events), for all platforms, not only those obliged to follow crisis protocols, etc.Reference

    Reply
  • 2
    0

    • Reporting categories should include, for example, “discrimination against content relating to body positivity, ableism or queer body concepts”, in order to create sensitivity for the body-normative material that is reviewed (and taken down) and to give space for deviating aesthetics in automated reporting systems. Decisions here usually follow a logic of internal reporting categories, which are often too limiting to handle the complaint appropriately. Furthermore, it would be good to understand to what extent automated reporting systems are involved in these complaint-handling decisions.Reference

    Reply
  • 1
    0

    Audit AI-based decision making, especially with regard to favouring specific, exclusive groups, topics or individuals. It is not enough to know that automated means are used to process the notice and to be transparent about it. The indicators behind the automated decisions need to be clarified further, as well as what kind of safeguards are applied and in what way.Reference
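    One elementary form such an audit could take is sketched below: comparing automated removal rates across groups and flagging large disparities for deeper human review. The data, group labels and disparity measure are invented for illustration.

```python
# Invented data illustrating one elementary audit step: comparing automated
# removal rates across groups and flagging large disparities for human review.

from collections import defaultdict

decisions = [
    # (group of the affected account, automated decision)
    ("group_a", "removed"), ("group_a", "kept"), ("group_a", "removed"),
    ("group_b", "kept"),    ("group_b", "kept"), ("group_b", "removed"),
]

totals = defaultdict(int)
removed = defaultdict(int)
for group, decision in decisions:
    totals[group] += 1
    if decision == "removed":
        removed[group] += 1

# share of automated decisions that ended in removal, per group
rates = {group: removed[group] / totals[group] for group in totals}
print(rates)

# a high ratio between the largest and smallest removal rate signals that the
# underlying model and its safeguards need a deeper, human review
disparity = max(rates.values()) / min(rates.values())
print(f"disparity ratio: {disparity:.2f}")
```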

    Reply
  • 0
    0

    ARTICLE 19 believes that platforms and other tech companies should not be held liable simply because they adopt community standards and use human moderators or other tools to enforce them. In this sense, we support the adoption of a Good Samaritan rule that would encourage ‘good’ content moderation efforts made in good faith. We therefore support keeping Article 6 in the text of the DSA.

    At the same time, we believe that the text of Article 6 should be clarified. It currently states that intermediary services are not ineligible for the exemptions from liability *solely* because they carry out a range of voluntary measures. This is likely to create legal uncertainty, as it suggests that there could be circumstances where carrying out voluntary measures, coupled with something else (which is undefined), could lead to a loss of immunity from liability. Either ‘solely’ should be removed, or the circumstances in which immunity from liability may be lost in connection with voluntary actions should be clarified.Reference

    Reply
  • 0
    0

    ARTICLE 19 strongly supports this provision as a cornerstone of the protection of freedom of expression and the right to privacy online.Reference

    Reply
  • 0
    0

    Putting an end to the fragmentation can only be achieved if national regulatory measures are replaced by a unified framework. But this would require an effective ban on due diligence provisions by Member States which go beyond the requirements of the DSA. Otherwise the fragmentation would persist.Reference

    Reply
  • 0
    0

    It should be clarified that “knowledge” of information refers to the knowledge of a human being, and not, for example, of an algorithm. It should also be clarified that “control” means exerting influence on the material content of the information itself, and not, for example, merely “controlling” how or when it is displayed.Reference

    Reply
  • 0
    0

    ARTICLE 19 believes that Article 14 (3) should be removed, as it would incentivise the removal of content upon mere receipt of a notice of *alleged* illegality submitted by a third party. In practice, only the biggest companies would be able to afford the services of lawyers to help them decide the degree of liability risk they would expose themselves to by keeping the content up. In other words, Article 14 (3) would have a significant chilling effect on freedom of expression.Reference

    Reply
  • 0
    0

    It is correct and necessary to require online platforms to provide a notice and action mechanism, because otherwise their liability pursuant to Art. 5 DSA would, in the worst case, never be triggered. The same may apply to storage services for personal use, such as Dropbox.

    BUT it is ridiculous and unnecessary to require web hosting services in the sense of cloud providers to offer a notice and action mechanism. In fact, the liability exception for them is not broad enough, since they literally just provide infrastructure, as “mere conduit” and “caching” services do.

    Content moderation is, where necessary, already performed by the hosting service for which they provide the technical resources, so this puts an unnecessary burden on cloud providers. In particular, it unnecessarily incentivises them to deny their service to hosting services which allow problematic but legal content, since these entail an increased risk of becoming liable for the content thereon, which, in accordance with their business model, cloud services do not have the resources to moderate.Reference

    Reply
  • 0
    0

    ARTICLE 19 is concerned that currently nothing would prevent law enforcement and other public authorities from being awarded trusted flagger status. In our view, public authorities, including law enforcement, should not be able to avail themselves of a status in circumstances where it enables them to bypass judicial procedures or other due process safeguards.Reference

    Reply
  • 0
    0

    Giving priority to notices from trusted flaggers seems reasonable, since their notices are more likely to point to actually illegal content than those from the average flagger, and are also more likely to contain sufficient information to assess the legality of the content in question. However, even notices from trusted flaggers can be unfounded and objectively wrong. As such, they should be analysed with the same scrutiny, and as critically, as any other notice. In that regard, I am missing a clarification here that service operators may not be obliged to give notices from trusted flaggers a higher degree of trust, and that such notices should in all cases still be duly evaluated for their correctness.Reference
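    The distinction between prioritising trusted-flagger notices and trusting them more could look roughly as follows; the queue and the review function are assumptions made for this sketch only.

```python
# Sketch of the distinction made above: trusted-flagger notices are *prioritised*
# (handled earlier) but not *trusted more* (they pass the identical assessment).

import heapq
import itertools

class NoticeQueue:
    def __init__(self) -> None:
        self._heap = []
        self._order = itertools.count()  # keeps FIFO order within the same priority

    def submit(self, notice: dict, trusted_flagger: bool) -> None:
        priority = 0 if trusted_flagger else 1  # lower number is handled first
        heapq.heappush(self._heap, (priority, next(self._order), notice))

    def next_notice(self) -> dict:
        return heapq.heappop(self._heap)[2]

def review(notice: dict) -> str:
    # identical assessment for every notice, regardless of who submitted it
    return "remove" if notice.get("manifestly_illegal") else "keep / escalate to legal review"

queue = NoticeQueue()
queue.submit({"id": 1, "manifestly_illegal": False}, trusted_flagger=False)
queue.submit({"id": 2, "manifestly_illegal": True}, trusted_flagger=True)
print(review(queue.next_notice()))  # the trusted flagger's notice comes up first, same scrutiny
```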

    Reply
  • 0
    0

    It is commonly accepted that traditional media, such as newspaper outlets but also publicly financed TV broadcasters, must be sufficiently independent from the government. The same must apply to online platforms, which now de facto control the public discourse even more than traditional media do. It is good and important that this provision clarifies that Member States must ensure that the Digital Services Coordinators are equally independent from the government. Otherwise the necessary independence of online platforms from the government would be threatened, given that the DSCs have administrative powers over these platforms.Reference

    Reply
  • 0
    0

    This chapter is missing an exception for pure infrastructure providers, like cloud providers or DNS services (see my general remarks).
    There should be 3 new articles:

    Cloud
    1. Where an information society service is provided that consists of the provision of technical resources aimed at enabling another information society service to offer its service, the service provider shall not be liable for the activities of the other service provider, including the transmission of information in a communication network or the saving thereof, on the condition that the provider:
    a. does not have actual knowledge of the other service’s primary purpose being the engagement in, or the support of, criminal activity; or
    b. upon obtaining such knowledge, acts expeditiously to cease providing its service to the other service, including the removal or disabling of access to illegal content saved on that other service.
    2. Paragraph 1 shall also apply in cases where the same technical resources are provided to a natural person for their personal use only.
    3. Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider.
    4. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement.

    Contact
    1. Where an information society service is provided that consists of the storage of information provided by a recipient of the service, the service provider shall not be liable for the information stored at the request of a recipient of the service if the sole purpose of that information is to enable others to contact the recipient of the service over a communication network on a technical level.
    2. Paragraph 1 shall also apply in cases where the saved contact information relates to a third party, if that third party has requested the recipient of the service to store that information on the service.
    3. Paragraph 1 shall not apply in respect to additional information stored by the service which is unrelated to contacting the recipient of the service on a technical level. This does not preclude the saving of additional data which is relevant for the contacting of the recipient on a technical level, such as options or preferences relevant for connections and messages.
    4. Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider.
    5. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement.

    Auxiliary
    1. Where an information society service is provided that consists of the storage of information provided by another information society service, or the processing thereof, performed for the sole purpose of assisting the other service in providing its service, the service provider shall not be liable for the information stored or processed on behalf of the other service, on the condition that the provider:
    a) only modifies the information as requested by the other service; and
    b) enables the other service to remove or disable access to the information stored.
    2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider.
    3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement.Reference

    Reply
  • 0
    0

    This provision is currently incredibly vague and gives wide discretion to the Commission and VLOPs to adopt measures they deem ‘reasonable, proportionate and effective’ to deal with the systemic risks they identify. In practice, digital rights groups often disagree with governments about the measures that ought to be adopted to deal with the dissemination of illegal content. Fundamental rights are also often in conflict. Article 26 currently gives no guidance as to how such conflicts ought to be resolved. ARTICLE 19 also agrees with EPD’s comment above about Article 26 (1) (c): in our view, at the very least, references to ‘civic discourse’ should be removed.Reference

    Reply
  • 0
    0

    A hosting provider should also become liable if it fails to process notices of illegal content as required by Article 14.
    Add the following paragraph:

    5. Paragraph 1 shall not apply where the service does not comply with the requirements set out in Article 14 (1) and (6).

    The above new paragraph should make Article 14 (3) unnecessary, and it should therefore be removed.Reference

    Reply
  • 0
    0

    It should be clarified, either here or in the recitals, that actual knowledge must also encompass the illegality of the specific content or activity. More precisely, a service operator can only be held liable if it knows of the content in question and has no reason to assume it to be legal. This is the interpretation given to this term by most courts so far, and it is vital for establishing a proper balance with the right to free speech.Reference

    Reply
  • 0
    0

    At the very least, Article 27 (or better still Article 7) should make clear that weakening encryption is never a reasonable, proportionate and effective mitigation measure.Reference

    Reply
  • 0
    0

    This clarification alone is not enough to ensure that service operators can securely perform own-initiative investigations. There needs to be an express rule which ensures that service operators are not required to perform a full assessment of the legality of all content of which they become aware during such investigations, because this would slow down such investigations and/or carry significant liability risks, disincentivising such own-initiative measures.
    Add the following new paragraph:

    (2) Where a provider of a hosting service as defined in Article 5 becomes aware, through a voluntary own-initiative investigation, of specific information saved on its service by a recipient of that service, the service provider shall not be liable for that information on condition that the provider:
    (a) does not have actual knowledge of the content’s or activity’s illegality; or
    (b) upon obtaining such knowledge, acts expeditiously to remove or to disable access to the illegal content.Reference

    Reply
  • 0
    0

    Pure infrastructure providers, e.g. cloud providers, should be obligated to respect the net neutrality requirement, as access providers currently have to. All other service operators are dependent on them, and it is unacceptable for them to prevent their recipients from using their service for legal purposes where they do not consider such a use to be “in good taste”. On the other hand, some smaller providers of this kind might be operated from the outset solely to support a certain group of people, which should still be allowed.
    Two new paragraphs should be added:

    3. Providers of intermediary services which enable other information society services to operate may not exclude in their terms and conditions the use of their service by services which are not in breach of any applicable Union or national law. All services operating in compliance with all applicable laws must be able to use their services.
    4. Paragraph 3 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC.Reference

    Reply
  • 0
    0

    This is ridiculous and unacceptable. As this reads, a hosting service automatically becomes liable for content for which it has received a notice, irrespective of whether it can conclude (with certainty) that the content in question is illegal. This would lead to massive and unjustifiable overblocking.

    This paragraph should be removed, and instead there should be a new paragraph in Article 5 which makes hosting services liable if they do not provide, or do not process, notices as mandated by Article 14 (see also my remarks on Article 5).

    At the very least, this should be rewritten in such a way that a service operator is only considered to obtain knowledge of the information contained in the notice. Whether or not this gives rise to actual knowledge or awareness for the purposes of Article 5 should be evaluated on a case-by-case basis.Reference

    Reply
  • 0
    0

    Such orders should also contain a statement of whether the service provider is allowed to inform the recipient(s) to whom the information belongs about the order, including the redress possibilities, immediately, and if not, when it is allowed to do so.

    Otherwise there is a risk of confusion. Service operators might refrain from informing the recipient even in cases where withholding the information is unnecessary, unduly limiting the recipient’s access to redress possibilities, or they might inform the recipient in cases where they are not meant to, possibly endangering a criminal investigation.Reference

    Reply
  • 0
    0

    First of all, this paragraph is missing a liability exception for the service operator if it re-enables access to content it has considered illegal after the body decides that it is legal, such decisions being binding as the paragraph currently specifies. Without such a liability exception, a service would risk massive claims for damages if the decision of the body turns out to be wrong and the content is indeed illegal.

    In fact, the first decision to remove it would provide sufficient grounds to prove actual knowledge as used in Article 5, triggering direct liability.

    Secondly, there is no reason why only the recipient, but not the service, can seek redress against the body’s decision. Both should be allowed to go to court if they consider this necessary.Reference

    Reply
  • 0
    54

    Obliging service operators to pay the expenses of the out-of-court dispute in all cases where their own decision gets overturned is disproportionate.

    In respect of illegal content, service operators face the problem of being obliged to anticipate the interpretation of the law in all relevant cases, even in cases where there are no relevant court decisions. In fact, most laws are written in an abstract way to account for all cases, including those which are yet unknown.

    In the interest of the protection of all fundamental rights, it is necessary to remove not only content whose illegality has been definitively confirmed, but also content which is likely to be illegal under a foreseeable interpretation of the law.

    If a service operator decides in good faith to interpret the law such that a certain type of content is illegal, but the out-of-court body interprets the law even slightly more laxly than the service, service operators will either have to allow content which they consider illegal, putting them at immediate risk of later liability given their prior official statement of such an interpretation and the thereby confirmed actual knowledge of its illegality, or face a high number of out-of-court proceedings, which will foreseeably end with them having to pay fees and expenses high enough to endanger the profitability of their business.

    In that regard, it would not even be enough to also make recipients pay the expenses of the service if the body confirms the service’s decision.

    In the interest of all parties, I would suggest making such out-of-court bodies available free of charge to both parties, for example by ensuring that they are publicly financed.Reference

    Reply
  • 8
    18

    As far as I understand this (and the next) article, it binds Member States to certain requirements concerning orders IF they want to implement them in their national law, but does not mandate them. Furthermore, it leaves it up to the Member States to decide when and how such orders can be issued by a competent authority, and in particular gives them full freedom regarding the conditions for how and when they become binding, whether or not they can be challenged in court, and whether they are binding until a court overturns them.

    But these conditions are, besides the informational minimum requirements currently set out, the most important factors for such orders. As such, they should be harmonised across the digital single market.

    First, at a bare minimum, all Member States should be obliged to provide judicial redress possibilities for both the service provider and the affected user. This is technically already required by basic rule-of-law principles, but expressly including such a requirement would be best.

    The DSA should also either provide clear timelines within which such orders must be answered, or at least provide maximum and minimum time windows which must be given to a service operator to answer an order, while letting Member States decide for themselves which time window they choose for their orders (e.g. at least 1 hour but no more than 24 hours, or 7 days for MSEs). Service operators should be obliged either to accept and fulfil the order or to contest it within the given time window.
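    A minimal sketch of such a harmonised time-window scheme is given below; the concrete values follow the example above, and the MSE carve-out and Member State configuration are part of that example rather than of the proposal.

```python
# Sketch of the harmonised time-window idea: EU-wide minimum and maximum bounds,
# a Member State choice inside those bounds, and a longer fixed window for
# micro and small enterprises (MSEs). Values follow the example in the text.

from datetime import timedelta

EU_MIN_WINDOW = timedelta(hours=1)   # an order may never demand a faster answer
EU_MAX_WINDOW = timedelta(hours=24)  # nor grant non-MSEs more time than this
MSE_WINDOW = timedelta(days=7)       # fixed window for micro and small enterprises

def response_window(member_state_window: timedelta, is_mse: bool) -> timedelta:
    """Window within which an operator must accept or contest an order."""
    if is_mse:
        return MSE_WINDOW
    if not (EU_MIN_WINDOW <= member_state_window <= EU_MAX_WINDOW):
        raise ValueError("Member State window lies outside the harmonised EU bounds")
    return member_state_window

print(response_window(timedelta(hours=6), is_mse=False))  # 6:00:00
print(response_window(timedelta(hours=6), is_mse=True))   # 7 days, 0:00:00
```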

    The DSA should further provide that any such order does not become legally binding with respect to the service operator until the operator has either accepted the order or no further redress is available to it. I think that this requirement is by far the most important one here in terms of the necessary protection of free speech, and as such it should be made binding for all Member States.

    First of all, one has to consider the abuse potential of removal orders, which may be issued not only by independent courts but also by the executive branch. If such orders were always binding until a court declared them unfounded (e.g. due to the legality of the content in question), the executive branch would have the possibility to continuously issue unfounded removal orders in order to enforce its own interpretation of the law, effectively bypassing judicial oversight and violating one of the most basic principles of the separation of powers. After all, the service operators would be required to comply even with obviously unfounded removal orders until a court had overturned them.

    Secondly, one has to consider that in obvious cases of illegal content, which by their nature include the most serious ones, like death threats, CSAM or terrorist propaganda, the order itself usually gives rise to actual knowledge of the illegal content, making the service operator directly liable for the content in question in cases of well-founded removal orders. Given this, operators will only ever contest removal orders which are either obviously unfounded or concern difficult edge cases where the protection of free speech is especially important.

    Lastly, it has to be considered that such a contest automatically leads to judicial proceedings, which require a non-negligible amount of financial resources that may be entirely lost if the order is confirmed by the courts. As such, service operators are strongly disincentivised from contesting removal orders and will only do so if they see a high probability of winning the case.

    Taking these considerations into account, there is no valid reason for making removal orders automatically binding. In obvious cases of illegal content they become de facto binding through the regular liability regime anyway, and in all other cases service operators are highly unlikely not to accept them unless the relevant content is likely protected by free speech.

    An additional option would be to add the requirement that removal orders may only be issued by an independent court which verifies the legitimacy of the removal order before it is issued.Reference

    Reply
  • 2
    0

    It is not clear to me whether notices must contain all the elements listed, or whether it is acceptable for a user to leave out some of the elements (e.g. the name or the statement of good faith). Especially the wording of (c) suggests that service operators may deny the submission of a notice if it does not contain the fields in question.

    I see no reason for such a requirement. Especially with respect to manifestly illegal content, there is no need for any of these elements besides the exact URL of the illegal content. It should be made clear that a service operator may only deny the submission of notices which do not contain element (b). All other fields should not be mandatory, especially the name field.Reference
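    The suggested submission rule, under which only the exact URL is mandatory, could be sketched as follows; the field names loosely mirror the elements of Art. 14 (2) and are otherwise assumptions.

```python
# Sketch of the proposed submission rule: only the exact URL (element (b)) is
# required to file a notice; explanation, name, e-mail and the good-faith
# statement stay optional. Field names are illustrative assumptions.

OPTIONAL_FIELDS = {"explanation", "name", "email", "good_faith_statement"}

def validate_notice(notice: dict) -> tuple[bool, str]:
    url = notice.get("url", "").strip()
    if not url:
        return False, "rejected: exact URL of the allegedly illegal content is missing"
    missing = sorted(OPTIONAL_FIELDS - notice.keys())
    note = f"accepted (optional fields not provided: {missing})" if missing else "accepted"
    return True, note

print(validate_notice({"url": "https://example.org/post/123"}))
print(validate_notice({"name": "Jane Doe"}))  # no URL -> rejected
```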

    Reply
  • 1
    0

    I see no reason why a user should be required to provide their name. Providing the e-mail address should be sufficient here.Reference

    Reply
  • 1
    0

    I see no reason why reasons should only be given when content is removed, but not when it is left online.

    This requirement also seems to be disproportionate for a range of hosting providers, especially those which are operated for personal use only. They should be allowed to arbitrarily remove content or expel users from their service without an explanation.

    Furthermore, there are many hosting services which allow users to upload content without an account, and which therefore have no possibility to inform the user of the reasons for a removal, or even of the removal itself.

    I would therefore suggest excluding MSEs from this requirement.Reference

    Reply
  • 1
    0

    This can put an unreasonable burden on MSEs. Many smaller hosters use third-party software which does not provide an interface for such a submission, and they have neither the expertise to implement one nor the financial resources to hire someone to do it. In fact, many such sites are run on a private budget only. As such, MSEs should be excluded from this requirement.

    In fact, I see no reason why any hosting services other than online platforms should provide their reasoning publicly. Only on such services do removal decisions have a notable impact on free speech, and only there does a sufficient need for public oversight exist to justify such a requirement. I would therefore suggest making this requirement binding only for online platforms, again excluding MSEs for reasons of proportionality.Reference

    Reply
  • 0
    0

    6 months seems like an unreasonably long time span. 1 or maybe 2 months is more than enough in the vast majority of cases, if not all.

    Furthermore, I see no reason why only a decision to remove or suspend can be complained against, but not a decision not to remove content, which can also be unjustified.Reference

    Reply
  • 0
    0

    This is an important requirement. It should also be mandated that the person(s) handling the complaint are not the same as the one(s) who made the contested decision. Otherwise the handling will naturally be biased and thus not objective.Reference

    Reply
  • 0
    0

    This paragraph is missing an explicit requirement to have due regard for the fundamental rights of the recipients of the service. In fact, I think that any impairment of the exercise of fundamental rights is disproportionate here, given that the primary purpose of any such risk mitigation measures is to protect the exercise of fundamental rights.
    The following paragraph should be inserted:

    2. The measures referenced in paragraph 1 shall not impair the exercise of fundamental rights by ordinary persons.

    The restriction to ordinary persons is optional and intended to clarify that intentional manipulations by the service, especially by bots, should not be covered by this guarantee even if they solely spread legal content.Reference

    Reply
  • 0
    0

    It is ridiculous to demand that the DSCs be politically independent while granting the greatest powers to the Commission, a de facto political organisation. At the very least, it cannot be disputed that the Commission is NOT politically independent. Given that the Board consists of the DSCs and is therefore constructed as politically independent, it is solely the Board that should have the power to open any proceedings and determine whether an infringement is taking place.

    The Commission should therefore be bound by the decisions of the Board. It is acceptable for it to organise the proceedings as long as the power remains with the Board.Reference

    Reply
  • 0
    0

    Again, such powers must only be given to a politically independent party. Accordingly, such orders should only be issued by the Board. Taking into account the possibility of abuse, or of a disproportionate impairment of the exercise of fundamental rights through such interim measures, it would be acceptable for the Commission to be granted a veto right over such orders.Reference

    Reply
  • 0
    0

    This sounds good and reasonable in theory, but fails to provide the intended effect in practice. Not losing the liability exemption in general does not mean a service operator might not lose it in specific cases. Voluntary own-initiative investigations can lead an operator to become “aware of facts or circumstances from which the illegal activity or illegal content is apparent” while not obtaining “actual knowledge” of its illegality. Pursuant to Art. 5 (1), an operator would still lose its immunity in such a case, at least as regards claims for damages.

    This rule was established to ensure that service operators duly assess the legality of content of which they are notified. Such an assessment can take a reasonable amount of time and usually requires looking up the relevant laws.

    In the context of own-initiative investigations this rule is discouraging, because liability is already triggered if the operator would have come to the conclusion of illegality after a proper legal assessment, irrespective of whether such an assessment is intended or even reasonable.

    This makes own-initiative investigations on a best-effort basis impossible, because an operator risks losing its immunity if it instructs its moderators to look only for manifestly illegal content, while ignoring all problematic but, at first sight, not necessarily illegal content, for maximum efficiency in tackling the most serious illegal content.

    I therefore suggest creating a provision which ensures that, in the context of voluntary own-initiative investigations, a service operator only loses its immunity from liability if it has obtained actual knowledge of illegality, giving operators full freedom to decide for themselves whether, and if so how thoroughly, they perform a legal assessment of the information they become aware of during such investigations.Reference

    Reply
  • 0
    0

    Law Enforcement Authorities should not be considered “trusted” flaggers. We have similar concerns about granting such status to commercial industry representatives.Reference

    Reply
  • 0
    0

    We suggest including a para 3, stating that providers of intermediary services must enable the anonymous use of their services, except where this is technically or legally infeasible. Deviating terms of service requiring users to use their real names should be considered unlawful. Alternative: Amending the proposal by adding a new provision on online anonymity.Reference

    Reply
  • 0
    0

    We are concerned that the DSA proposal may follow in the footsteps of recent disastrous Internet bills, which ignore the fundamental rights impact at a global level by targeting services regardless of their place of establishment. In line with the EC’s impact assessment, the EP should add an exception for small and non-professional providers of services.Reference

    Reply
  • 0
    0

    A “notice-equals-knowledge” approach will lead to overblocking and harm for marginalized groups. It should be rejected by the Parliament: Under current rules, host providers only benefit from limited liability for third-party content when they expeditiously remove content they “know” to be illegal. By attaching knowledge to substantiated notices, platforms will have no other choice than to block content to escape the liability threat. Even worse, platforms are explicitly incentivized to use “automated means” (para 6), which undermines the ban of mandated monitoring obligations.

    The Parliament should thus delete Art. 14 (3) and (6) and redesign the provision. It is the wrong approach to put the responsibility on the shoulders of users (who are supposed to know why content is illegal) and to appoint platform moderators as online judges (who are supposed to have the legal qualification to decide whether content should be removed). The Parliament should propose rules that 1) acknowledge that mistakes happen on both the user and the platform side and 2) guarantee independent judicial oversight.Reference

    Reply
  • 1
    2

    Platforms are required to mitigate risks that occur as a result of their terms of service enforcement, user notifications, and measures allowing platforms to identify potentially illegal content. Combined with the provisions on sanctions, such a model would create a strong incentive to proactively avoid the occurrence of risks, using measures that cannot be mandated by the letter of the law (such as automated systems) but are the most cost-effective way to comply with DSA obligations. In this respect, the undefined requirements of acting in a “reasonable, proportionate and effective” manner do not give much guidance on how platforms should act in practice and fall short of human rights standards and the related question of what is necessary in a democratic society.

    We suggest that the EP improve this provision by 1) including Recital 58 in the operative part of the text, 2) making sure that mitigation measures do not undermine fundamental rights, including freedom of expression, and do not undermine the ban on mandated monitoring obligations, and 3) following international human rights principles; e.g. it should be added that measures must be necessary (i.e. the least restrictive means of achieving the desired objective).Reference

    Reply
  • 0
    0

    If the EU wants to end the online dominance of a few powerful platforms, we need rules that enable users to communicate with friends across platform boundaries.

    The EU Parliament should include an Article 33a with a general interoperability mandate for very large online platforms. Such platforms should enable competing smaller providers to interoperate with their core features. https://www.eff.org/deeplinks/2020/06/our-eu-policy-principles-interoperability

    On the question of unfounded notices, the EU Parliament should distinguish between repeated, often automated notices submitted by entities or persons with specific expertise related to the content in question and complaints by individual users, who will often not possess such knowledge and will not be in a position to understand the consequences of misjudging content as illegal. This is particularly true in the realm of intellectual property rights, as profit-seeking organisations of industry and of rightsholders can be awarded the status of a trusted flagger under Article 19 (which should not be the case). The EU Parliament should also clarify whether third parties other than online platforms can refer the matter to the Digital Services Coordinator if trusted flaggers misuse the notice and action system.

    The proposed enforcement structure relying on national regulators (including Digital Services Coordinators (DSC)) and the European Commission harbors the risk of weak and inconsistent enforcement. In some member states, regulators might be capable and willing to do this, and might have already taken up some of these tasks. In other countries, regulators might be willing, but need more time to secure funding and build up necessary expertise, or others may not be willing at all. As a result, EU-wide rules from the DSA could be unevenly enforced. This would considerably weaken the DSA’s impact and repeat some of the issues plaguing another landmark piece of EU legislation, the General Data Protection Regulation (GDPR). European data protection rules grant citizens important rights and impose duties on companies processing personal data, including big online platforms. Yet, the GDPR’s potential positive effects are seriously hampered by different levels of enforcement across EU member states.

    Policymakers in the member states and at the EU level should overhaul their current enforcement plans and build a dedicated European-level agency charged with enforcing the DSA’s new due diligence rules. Transparency reporting, explanations for recommender systems and audits are matters of EU-wide concern and apply to tech companies with an EU-wide reach, so they should be overseen by an EU body. This body should focus specifically on platforms offering citizens digital spaces where they exchange views with one another, consume and share news, and receive and send political messages. This would include search engines, social media sites, video platforms and messenger services with public, social networking functions. Such platforms are important enough and different enough from other platforms and other industries that they require their own specific oversight regime. Their design and business model also carry specific individual and societal risks, such as amplifying disinformation, algorithm bias and privacy concerns, that necessitate their own oversight. The DSA provides the rulebook for addressing these risks, but not the right mechanisms to enforce the rulebook. Instead of relying on at least 27 different national regulators, the Commission and a new European advisory board, the EU should build a single European Digital Services Coordinator to deal with social networks and search engines.

    The suggested approach of giving European regulatory networks (more) enforcement powers faces challenges, too. The ERGA especially seems to be keen on enhancing its (members’) powers. Yet, resources are unevenly spread among these bodies, which might create difficulties for properly coordinating an EU-wide enforcement of the DSA via such networks. For instance, the ERGA only has a secretariat located within the Commission, whereas the BEREC has a coordination level established as an official EU agency. Building up expertise and reshuffling departments at existing agencies and networks is not guaranteed to be easier than designing an agency from scratch. More crucially, by relying only on these bodies without developing a European platform agency, the DSA’s enforcement will continue to be fragmented across policy fields and member states.

    A more detailed analysis of this can be found in this SNV policy brief: https://www.stiftung-nv.de/sites/default/files/snv_dsa_oversight.pdf
