"NOTICE AND ACTION" MECHANISMS

The Greens/EFA here present the final legislative proposal for a notice-and-action mechanism and for content moderation by online platforms for the upcoming Digital Services Act (DSA). This was no easy task, considering that there are currently no clear rules for online platforms such as YouTube, Twitter, Facebook or Instagram on how they should handle user complaints, notices of illegal content, or content under their terms and conditions.

The DSA is the opportunity to build a better internet that is democratic and safe. We must ensure that illegal content is taken down while our freedom of expression is protected. With this model law, the Greens/EFA propose the first comprehensive framework for an EU-wide mechanism, which was handed over to the European Commission today and which will serve as the basis for our legislative work on the DSA.

The public was able to comment on the draft text from 1 October to 1 November. Over the past month we have worked on an updated version that implements (as best we could) all the comments received.

We, the Greens/EFA group, firmly believe that open collaboration and a participatory process can help policymakers write better laws and reflect the latest thinking of NGOs, academics, industry and internationally recognised experts. Even though we do not have all the answers, our draft law offers a first set of suitable solutions.

We received contributions from civil society, industry, academia and internationally recognised experts, whom we cannot thank enough for all their valuable input:

NGOs

AccessNow, Article 19, AWO, EFF, HateAid, ILGA, NOYB, Wikimedia.

Industry

DOT Europe (formerly EDiMA)

Individual experts and academics

Martin Husovec, Daphne Keller, Aleksandra Kuczerawy, Joe McNamee, Ignacio-Wenley Palacios Iglesias, Ben Wagner.

Download
Final legislative proposal

Regulation on procedures for notifying and acting on illegal content and for content moderation under terms and conditions by information society services

(“Regulation on a Notice and Action mechanism and content moderation procedures”)

Chapter I

General Provisions

Article 1
Subject matter, objectives and scope

  1. This Regulation lays down rules to establish procedures for notifying and acting on illegal content and on content in violation of terms and conditions hosted by information society services and to ensure their proper functioning. Non-commercial services or services that have fewer than 100 000 monthly active users shall be exempt from this Regulation.
  2. This Regulation seeks to contribute to the proper functioning of the internal market by ensuring the free movement of intermediary information society services in full respect of Regulation 2016/679, Directive 2002/58/EC, and the Charter of Fundamental Rights, in particular its Article 11 on the right to freedom of expression and information, Article 7 on the right to privacy, Article 16 on the freedom to conduct a business and Article 47 on the right to an effective remedy and a fair trial.
  3. This Regulation applies to information society services targeting the Union market, irrespective of the place of establishment or registration, or principal place of business of the information society service.

Article 2
Definitions

For the purpose of the Regulation:
  1. ‘information society service’ means any service provided by means of electronic equipment for the processing and storage of data, and at the individual request of a recipient of a service.
  2. ‘hosting service provider’ means any information society service provider that provides a service consisting of the storage of information provided by and at the request of a recipient of the service, and in making information stored available to third parties. 
  3. ‘content’ means any information in any format such as text, images, audio and video.
  4. ‘allegedly illegal content’ refers to content being subject to allegations of illegality.
  5. ‘manifestly illegal content’ refers to content that is unmistakably, and without requiring review by a lawyer, in breach of legal provisions regulating the legality of content on the internet in a Member State. 
  6. ‘content moderation’ means the practice of sorting, disabling access to, removing or demoting content provided by a recipient of a service by applying a pre-determined set of rules and guidelines with the goal of complying with its terms and conditions, as well as any equivalent measure taken by the hosting service provider.
  7. ‘content provider’ refers to a user who has provided content that is, or that has been, stored at their request by a hosting service provider.
  8. ‘competent authority’ refers to an independent judicial authority which is established by a Member State.
  9. ‘supervisory authority’ means an independent public authority established by law in a Member State.
  10. ‘notice’ means a valid communication pursuant to Article 4 about allegedly illegal content with the objective of making a hosting service provider aware about the presence of said content.
  11. ‘notifier’ means any natural or legal person submitting a notice.
  12. ‘counter notice’ means a notice through which the content provider challenges a notice as regards content provided by the said provider.
  13. ‘terms and conditions’ means all terms, conditions, codes of ethics or rules, irrespective of their name or form, which govern the contractual relationship between the content hosting platform and its users and which are unilaterally determined by the hosting service provider.
  14. ‘actual knowledge’ means knowledge of an information society service of the unlawful nature of a specific piece of content present on its systems, obtained by means of a court order or a court decision following due process of law.
  15. ‘European standard’ means a European standard as defined in point (1)(b) of Article 2 of Regulation (EU) No 1025/2012 on European standardisation.

Chapter II

Notifying Allegedly Illegal Content

Article 3
The right to notify

  1. The hosting service provider shall establish a clear, easily and directly accessible mechanism that allows natural or legal persons to notify the hosting service provider regarding allegedly illegal content. Such mechanism shall allow for notification by electronic means and be as easy to use as notifications of content under the terms and conditions of the hosting service provider. 
  2. The hosting service provider should provide user-friendly information on how to report manifestly illegal content directly to law enforcement authorities.
  3. The European Commission shall, by means of delegated acts pursuant to Article 28, determine a common standard for the structure of the notification mechanism.

Article 4
Standards for notices

  1. Hosting service providers shall ensure that the mechanism described in Article 3 allows notifiers to submit notices which are sufficiently precise and adequately substantiated to inform the hosting service provider about the presence of allegedly illegal content. Valid notices shall at least contain the following elements:
    1. an explanation of the grounds for the claim that the content is illegal, accompanied by the legal basis, where possible, and an indication whether the content is believed to be manifestly illegal or possibly illegal;
    2. proof that substantiates the claim, where possible;
    3. a clear indication of the exact location of the allegedly illegal content (URL and timestamp where appropriate);
    4. a declaration of good faith, in the course of intellectual property right infringement disputes, that the information provided is accurate;
    5. information, in the course of intellectual property right infringement disputes, concerning the fact that the notifier has submitted its request to the content provider in vain or that the content provider could not be identified.
  2. Notifiers shall be given the choice whether to include their contact details in the notice. Where notifiers decide to include their contact details, their anonymity towards the content provider shall be ensured, except in cases of alleged violations of personality rights or of intellectual property rights. Such contact details may be used by the hosting service provider to request missing information or to send information pursuant to Article 6(2). 
  3. Notices shall not automatically trigger legal liability nor should they impose any removal requirement for specific pieces of content, as they do not establish actual knowledge.
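
By way of illustration only, and not as part of the legislative text, the minimum elements of a valid notice under Article 4(1) could be captured in a machine-readable structure along the following lines; all field and type names are hypothetical.

```typescript
// Illustrative sketch only: a hypothetical machine-readable encoding of the
// minimum elements of a valid notice under Article 4(1). Field names are
// invented for illustration and are not prescribed by the proposal.
interface IllegalContentNotice {
  groundsForIllegality: string;                         // point 1: why the content is considered illegal
  legalBasis?: string;                                  // point 1: legal provision relied on, where possible
  allegedSeverity: "manifestly_illegal" | "possibly_illegal"; // point 1
  proof?: string[];                                     // point 2: substantiating evidence, where possible
  contentLocation: { url: string; timestamp?: string }; // point 3: exact location
  // points 4 and 5 apply only in intellectual property infringement disputes:
  goodFaithDeclaration?: boolean;
  priorRequestToContentProvider?: "submitted_in_vain" | "provider_not_identifiable";
  // contact details are optional under Article 4(2):
  notifierContact?: { name: string; email?: string };
}
```

Whether and how such a structure would actually be standardised is left to the common standard foreseen in Article 3(3).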

Chapter III

Acting on Allegedly Illegal Content

Article 5
Scope of application of the notice and action mechanism

The provisions in Articles 3 and 4 shall only apply to allegedly illegal content. Content moderation practices under the terms and conditions of hosting providers are excluded from the scope of the notice and action mechanism. 

Article 6
Standardised communication channels

  1. Competent authorities of Member States shall issue removal orders and supporting documentation for hosting service providers via trusted communication channels established on the basis of a European standard and the templates set out in Annexes I-III. 
  2. The Commission shall, by means of implementing acts pursuant to Article 291 of the Treaty on the Functioning of the European Union (TFEU), define the European standard for the introduction of standardised Application Programming Interfaces (APIs) for trusted communication channels with hosting service providers.
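
Purely as an illustration of what the standardised interface referred to in paragraph 2 might look like, and not as part of the proposal, a provider-side API for the trusted communication channel could be sketched as follows; all names and signatures are assumptions made for this example.

```typescript
// Hypothetical sketch of a provider-side API for the trusted communication
// channel envisaged in Article 6. The actual European standard would be
// defined by the Commission via implementing acts; nothing here is prescribed.
interface TrustedChannelApi {
  // receive a removal order transmitted by a competent authority (template in Annex I)
  receiveRemovalOrder(order: string, authoritySignature: string): Promise<{ reference: string }>;
  // report that content was removed or access to it was disabled (template in Annex II)
  submitRemovalFeedback(reference: string, actionTakenAt: Date): Promise<void>;
  // report impossibility to execute, or ask for clarification (template in Annex III)
  reportNonExecution(reference: string, reason: string): Promise<void>;
}
```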

Article 7
Acting on illegal content notified by notifiers

  1. Upon notice from a notifier, the hosting service provider shall act expeditiously to disable access to content which is manifestly illegal. 
  2. Content that has been the subject of a notice and that is not manifestly illegal shall remain accessible while the assessment of its legality by the competent authority is still pending, without prejudice to judicial orders regarding content online. Member States shall ensure that the hosting service provider is not held liable while the assessment of legality is still pending.
  3. Upon notice from a notifier, the hosting service provider shall act expeditiously to transmit a copy of the notice for decision in accordance with Article 8 to the competent authority of the Member State where the hosting service provider has its main establishment, or, if it is not established in the Union, where it is legally represented, in order to request an assessment of the content, provided that the requirements under Article 4 are fulfilled. The decision by the competent authority shall be taken within 7 working days.
  4. Upon receipt of the removal order from the competent authority, the hosting service provider shall expeditiously remove illegal content. The removal becomes final where it has not been appealed within the deadline according to the applicable national law or where it has been confirmed following an appeal.
  5. In all cases, the final decision shall be undertaken by qualified staff, to whom access to psychological assistance is effectively provided.

Article 8
Acting on illegal content notified by competent authorities

  1. The competent authority of the Member State where the hosting service provider has its main establishment, or, if it is not established in the Union, where it is legally represented, shall have the power to issue a removal order requiring the hosting service provider to remove illegal content or disable access to it in all Member States, as provided by law.
  2. The competent authority of a Member State where the hosting service provider does not have its main establishment or does not have a legal representative may request access to be disabled to illegal content and enforce this request within its own territory.
  3. If the relevant competent authority has not previously issued a removal order to a hosting service provider, it shall contact the hosting service provider, providing information on procedures and applicable deadlines, at least 12 hours before issuing a removal order. Hosting service providers shall remove illegal content as soon as possible and no later than 24 hours after the receipt of a removal order, unless technical or operational reasons prevent them from complying within the deadline, in which case they shall provide exhaustive justification to the competent authority. They shall execute the removal without undue delay as soon as those reasons have ceased to exist. 
  4. Removal orders shall be transmitted to the hosting service provider within 7 working days of receiving a copy according to Article 7(3) and shall contain the following elements in accordance with the template set out in Annex I:
    1. identification, via an electronic signature, of the competent authority issuing the removal order and authentication of the removal order by the competent authority; 
    2. a detailed statement of reasons explaining why the content is considered manifestly illegal content or illegal content, and a specific reference to the categories of content;
    3. an exact location of the content (URL and timestamp where appropriate), and, where necessary, additional information enabling the identification of the content referred to; 
    4. a reference to this Regulation as the legal basis for the removal order;
    5. date and time stamp of issuing;
    6. easily understandable information about redress with the competent authority available to the hosting service provider and to the content provider, as well as deadlines for appeal; 
    7. where necessary and proportionate, the decision not to disclose information about the removal of illegal content, accompanied by the legal basis of this decision and means of appeal. 
  5. The competent authority shall address removal orders to the main establishment of the hosting service provider or to the legal representative designated by the hosting service provider, and transmit them to the point of contact of the hosting service provider. Such orders shall be sent by electronic means capable of producing a written record under conditions allowing the authentication of the sender, including the accuracy of the date and the time of sending and receipt of the order, to be established. 
  6. Hosting service providers shall, without undue delay, inform the competent authority about the disabling of access to manifestly illegal content indicating, in particular, the time of action, using the template set out in Annex II.
  7. If the hosting service provider cannot comply with the removal order because of force majeure or of de facto impossibility not attributable to the hosting service provider, including for technical or operational reasons, it shall inform, without undue delay, the competent authority, explaining the reasons, using the template set out in Annex III. The deadline set out in paragraph 3 shall apply as soon as the reasons invoked are no longer present. 
  8. The hosting service provider may refuse to execute a removal order if the removal order contains manifest errors or does not contain sufficient information. It shall inform the competent authority without undue delay, asking for the necessary clarification, using the template set out in Annex III. The deadline set out in paragraph 3 shall apply as soon as the clarification is provided. 
  9. The competent authority that issued the removal order shall inform the supervisory authority that oversees the implementation of specific measures when the removal order becomes final. A removal order becomes final where it has not been appealed within the deadline according to the applicable national law or where it has been confirmed following an appeal.
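
For illustration only, the elements that a removal order must contain under Article 8(4), mirrored by the template in Annex I, could be represented in a structured form roughly as follows; the field names are hypothetical and not prescribed by the text.

```typescript
// Illustrative only: hypothetical structured encoding of the elements of a
// removal order listed in Article 8(4) (cf. the Annex I template).
interface RemovalOrder {
  issuingAuthority: { name: string; electronicSignature: string };      // point 1
  statementOfReasons: string;                                           // point 2
  contentCategory: string;                                              // point 2
  contentLocation: { url: string; timestamp?: string };                 // point 3
  legalBasis: string;                                                   // point 4: reference to this Regulation
  issuedAt: string;                                                     // point 5: date and time of issuing
  redressInformation: { body: string; appealDeadline: string };         // point 6
  nonDisclosureDecision?: { legalBasis: string; meansOfAppeal: string };// point 7, where necessary and proportionate
}
```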

Article 9
The right of a notifier to be informed

  1. Upon receipt of a notice, the hosting service provider shall immediately send or display a confirmation of receipt of the notice and a copy of the information provided under Article 4 to the notifier.
  2. Upon the decision to act or not on the basis of a notice, the hosting service provider shall inform the notifier accordingly, if contact information is available.

Article 10
The right of a content provider to be informed

  1. Upon receipt of a notice or of a removal order, the hosting service provider shall promptly inform the content provider about the notice, about the possibility to issue a counter notice, how to complain to the hosting service provider and how to appeal a decision by either party with the competent authority.
  2. The first paragraph of this Article shall only apply if the content provider has supplied sufficient contact details to the hosting service provider, and shall respect the provisions of Article 4(2).
  3. Where requested by the competent authority, the first paragraph shall not apply in order not to obstruct the prevention and prosecution of serious criminal offences. In this case, the information should be provided as soon as the reasons not to inform cease to exist or an investigation has concluded.

Article 11
The right to counter notice

  1. Hosting service providers shall establish effective and accessible mechanisms allowing content providers whose content has been notified to submit a counter notice before any action of the hosting service provider, as well as after disabling access to manifestly illegal content, requesting that said content remains accessible or is restored.
  2. The hosting service provider shall promptly assess the counter notice and take due account of it.
  3. If the counter notice provides reasonable grounds to consider that the content is not manifestly illegal, hosting service providers may decide not to act on the basis of a notice or to restore, without undue delay, the content to which access was disabled, and inform the content provider and notice provider of such restoration. The hosting service provider shall ensure that the provider of the counter notice and the provider of the initial notice are informed about the follow-up to a counter notice, where their contact details were provided.
  4. Paragraphs 1 to 3 of this Article are without prejudice to the right to an effective remedy and to a fair trial enshrined in Article 47 of the Charter.

Article 12
Sanctions

  1. Member States shall lay down rules on administrative sanctions for hosting service providers for systemic violation of their obligations provided by Chapter II and Chapter III; national supervisory authorities shall be competent to impose administrative fines. Such sanctions shall be effective, proportionate and dissuasive.
  2. Member States shall lay down rules on administrative sanctions for notifiers who systematically and repeatedly issue vexatious and abusive notices in contravention of Article 4(1)(d); national supervisory authorities shall be competent to impose administrative fines. Such sanctions shall be effective, proportionate and dissuasive.

Chapter IV

Acting on Violations of Terms and Conditions

Article 13
Scope

  1. This chapter applies to hosting service providers’ actions regarding violations of their terms and conditions and to content moderation practices undertaken on their own initiative or upon a notice under the terms and conditions.
  2. Any action under this Chapter shall be suspended if a notice is received in accordance with Article 7 or a removal order is issued in accordance with Article 8. The action may be resumed if the content is restored following the applicable procedures.

Article 14
Minimum standards for terms and conditions

  1. Without prejudice to the Consumer Rights Directive and the Unfair Commercial Practices Directive, where hosting service providers adopt terms and conditions on content moderation specifying the rights of users, or other standards and practices for content moderation, these shall be publicly available in clear, plain language and in accessible formats. 
  2. Terms and conditions on content moderation as well as content moderation practices shall be fair, accessible, predictable, non-discriminatory, transparent and in compliance with the Charter of Fundamental Rights and international human rights standards.
  3. The hosting service provider shall inform users in advance, as applicable, and without delay of all changes to relevant policies regarding its terms and conditions on content moderation, in formats that they can easily access and understand, including explanatory guides.
  4. When operating in several Member States, the hosting service provider shall translate such documents into the languages that their users and affected parties understand.
  5. The hosting service provider shall provide for effective remedies, such as the restoration of content, apology, rectification and compensation for damages. 
  6. The hosting service provider shall take reasonable and proportionate measures to ensure that their terms and conditions are applied and enforced consistently and in compliance with applicable procedural safeguards. The prohibition of discrimination may under certain circumstances require that the hosting service provider make special provisions for certain users or groups of users in order to correct existing inequalities.
  7. For hosting service providers with more than 5 million monthly active users in the Union, the process of drafting and applying terms of service agreements, community standards and content-restriction policies should be transparent, accountable and inclusive. Such hosting service providers should seek to collaborate and negotiate with consumer associations, human rights advocates and other organisations representing the interests of users and affected parties, as well as with data protection authorities before adopting and modifying their policies.

Article 15
The right to notify under terms and conditions

  1. The hosting service provider shall establish an easily accessible and user-friendly mechanism that allows natural or legal persons to notify the hosting service provider regarding content that violates the terms and conditions on content moderation. Such mechanism shall allow for such notification by electronic means and shall be easy to use.

Article 16
Standards for notices of content violating terms and conditions

  1. The hosting service provider shall ensure that notifiers can submit notices which are sufficiently precise and adequately substantiated to inform the hosting service provider about a violation of its terms and conditions. For that purpose, notices under the terms and conditions shall at least contain the following elements:
    1. an explanation of why the content is notified;
    2. substantiation of the claim, where possible;
    3. a clear indication of the exact location of the content concerned (URL and timestamp where appropriate);
    4. a declaration of good faith that the information provided is accurate.

Article 17
The right of a notifier under terms and conditions to be informed

  1. Upon receipt of a notice under terms and conditions, the hosting service provider shall immediately send or display a confirmation of receipt and a copy of the information provided under Article 16 to the notifier under terms and conditions.
  2. Upon the decision to act or not on the basis of a notice under terms and conditions, the hosting service provider shall inform the notifier accordingly, if contact information is available.

Article 18
Moderating content under terms and conditions

  1. The hosting service provider shall promptly inform the content provider upon receipt of a notice under terms and conditions or upon content moderation by the hosting service provider on its own initiative under its terms and conditions.
  2. Content that has been notified under terms and conditions shall remain visible while its assessment is still pending. The hosting service provider shall take a decision within 7 days of receipt of a notice.
  3. When disabling access to content in line with their own policies, the hosting service provider should do so in a transparent and non-discriminatory manner.
  4. Upon receipt of a decision from the competent independent dispute settlement body, the hosting service provider shall expeditiously comply.
  5. In all cases, the final decision shall be undertaken by qualified staff, to whom access to psychological assistance is effectively provided.

Article 19
The right to be informed

  1. The hosting service provider shall inform the content provider and the notifier of the content moderation decision taken under its terms and conditions and the reasoning behind it.
  2. The hosting service provider shall inform the content provider about the possibility to issue a counter notice. The hosting service provider shall inform the content provider and the notifier under terms and conditions how to appeal a decision by either party through independent dispute settlement pursuant to Article 22.
  3. The first paragraph of this Article shall only apply if the content provider and notifier under terms and conditions have supplied sufficient contact details to the hosting service provider.

Article 20
The right to counter notice in content moderation procedures

  1. Hosting service providers shall establish effective and accessible mechanisms allowing content providers to submit a counter notice requesting that said content remains visible or is reinstated if:
    1. their content has been notified by users under the terms and conditions;
    2. their content has been removed, demoted or access to it has been disabled on the hosting service provider’s own initiative under the terms and conditions;
    3. their content has in any other way been affected by content moderation.
  2. The hosting service provider shall promptly assess the counter notice and take due account of it.
  3. If the counter notice provides reasonable grounds to consider that the content does not violate terms and conditions, hosting service providers may decide not to act on the basis of a notice under the terms and conditions. The hosting service provider shall ensure that the provider of the counter notice and the provider of the initial notice are informed about the follow-up to a counter notice, where their contact details were provided.
  4. Paragraphs 1 to 3 of this Article are without prejudice to the right to an effective remedy and to a fair trial enshrined in Article 47 of the Charter.

Article 21
Complaint mechanism

  1. Once the hosting service provider has taken a decision as regards notified content under terms and conditions, both the content provider and the notifier shall have the right to submit a complaint to the hosting service provider.
  2. A complaint shall at least contain the following elements:
    1. an explanation of the grounds for the complaint, accompanied, where possible, with the legal basis for the assessment of the content;
    2. proof that substantiates the complaint;
    3. a declaration of good faith that the information provided is accurate.

Article 22
Independent dispute settlement

  1. Member States shall establish independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse to appeal decisions on content moderation and actions under terms and conditions.
  2. Each  Member  State  shall  ensure  that  each  independent dispute settlement  body is  provided  with  the  human,  technical  and financial  resources,  premises  and  infrastructure  necessary  for  the  effective  and timely performance  of  its  tasks.
  3. The Commission shall, by means of delegated acts pursuant to Article 28, establish a fund to assist the Member States in financing the running costs of the independent dispute settlement bodies. This fund shall be financed by fines imposed on hosting service providers for non-compliance with the provisions of this Regulation as well as by a contribution from hosting service providers with significant market power.
  4. The independent dispute settlement bodies shall meet the following requirements:
    1. they are composed of legal experts;
    2. their composition is balanced, in particular with regards to racial or ethnic origin, gender, age, and socio-economic status;
    3. they are impartial and independent and therefore not subject to any instructions from either party or their representatives that are involved in the dispute;
    4. they provide their services in the language of the terms and conditions which govern the contractual relationship between the provider of hosting services and the user concerned;
    5. they are easily accessible either physically in the place of establishment or residence of the user, or remotely, using communication technologies;
    6. they are capable of providing their mediation services without undue delay;
    7. they are able to provide resolution of cases within 7 days of receipt of the duly completed referral form;
    8. they make publicly available information on the legal effect of the outcome of the independent dispute resolution procedure, including penalties for non-compliance where decisions have binding effect on the parties.
  5. The referral of a question regarding content moderation to an independent dispute settlement body shall be without prejudice to the right to accessible and independent recourse to judicial redress before a court, to effective remedy and to a fair trial enshrined in Article 47 of the Charter.

Article 23
Procedural rules for independent dispute settlement

  1. The content provider, notifiers and non-profit entities with a legitimate interest in defending freedom of expression and information shall have the right to refer a question of content moderation under terms and conditions to the competent independent dispute settlement body in case of a dispute regarding a content moderation decision taken by the hosting service provider.
  2. The independent dispute settlement body shall have jurisdiction over disputes arising out of notices submitted in the Member State where this body is located. Notwithstanding this, a natural person may refer a question of content moderation to the independent dispute settlement body of the Member State of domicile. 
  3. The hosting service provider shall comply with the decision of the independent dispute settlement body without undue delay.

Article 24
Sanctions

  1. Hosting service providers acting under the provisions of this Chapter shall not incur liability for providing access to legal third-party content.
  2. Legal obligations in this Chapter are without prejudice to Articles 12, 13 and 14 of Directive on electronic commerce 2000/31/EC.
  3. Member States shall lay down rules on administrative sanctions for systemic violation of obligations provided by Articles 22 and 23; national authorities shall be competent to impose administrative fines. Such sanctions shall be effective, proportionate and dissuasive.

Chapter V

Transparency Obligations

Article 25
The right to transparency on notice and action procedures

  1. Hosting service providers shall provide clear information about their notice and action mechanism described in Article 3 in their terms and conditions.
  2. Hosting service providers shall publish annual reports in a standardised and machine-readable format including:
    1. the number of all notices received under the notice and action system categorised by the type of content;
    2. information about the number and type of illegal content which has been removed or access to it disabled, including the corresponding timeframes per type of content;
    3. the specific legal basis for removals;
    4. the rejection rates and the number of pieces of content that were restored;
    5. the type of notifiers that issued the notices (private individuals, organisations, corporations, etc.) and the total number of their notices, where such details are provided;
    6. information about the number of complaint procedures, contested decisions and actions taken by the hosting service providers;
    7. information about the number of redress procedures initiated and decisions taken by the competent authority in accordance with national law;
    8. the type of measures adopted with regards to repeated infringers to ensure that the measures are effective in tackling systemic abusive behaviour;
    9. aggregated information on the types and amount of data that were shared with law enforcement authorities as a result of investigations into illegal content removed by the hosting provider.
  3. The obligations set out in this Article shall only apply to large hosting service providers with more than 5 million monthly active users in the Union. A supervisory authority may decide to extend these obligations to small and medium-sized companies and impose tiered obligations on companies of different sizes as needed.
  4. The Commission may, by means of delegated acts pursuant to Article 28, provide templates for transparency reporting, to ensure consistency across the single market and over time.
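
As a purely illustrative reading of paragraph 2, and without prejudice to the templates the Commission may provide under paragraph 4, the machine-readable annual report could take a shape similar to the following; all field names are hypothetical.

```typescript
// Illustrative only: one possible machine-readable layout for the annual
// transparency report required by Article 25(2). Field names are hypothetical.
interface NoticeAndActionTransparencyReport {
  year: number;
  noticesByContentType: Record<string, number>;                               // point 1
  removalsByContentType: Record<string, { count: number; timeframe: string }>;// point 2
  legalBasesForRemovals: string[];                                            // point 3
  rejectionRate: number;                                                      // point 4
  restoredContentCount: number;                                               // point 4
  noticesByNotifierType: Record<string, number>;                              // point 5
  complaintProcedures: { complaints: number; contestedDecisions: number; actionsTaken: number }; // point 6
  redressProcedures: { initiated: number; decisionsByCompetentAuthority: number };              // point 7
  repeatInfringerMeasures: string[];                                          // point 8
  lawEnforcementDataSharing: Record<string, number>;                          // point 9: aggregated data shared
}
```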

Article 26
Transparency obligations for Member States

Member States shall publish annual reports including:

  1. the number of all removal orders issued to hosting service providers categorised by type of content;
  2. information on the number of cases of successful detection, investigation and prosecution of offences that were notified to hosting service providers.

Article 27
The right to transparency on content moderation practices under terms and conditions

  1. Hosting service providers shall provide clear information about their content moderation practices.
  2. Hosting service providers shall publish annual reports in a standardised and machine-readable format including:
    1. a description of the procedures for the public to complain about possible violations of terms and conditions;
    2. the number of all notices and flags received under their terms and conditions categorised by the type of content and Member State;
    3. information about the number and type of content which has been removed, delayed, demoted or access to it disabled;
    4. the specific basis for the removal or disabling of content;
    5. the number of content moderation decisions that were reversed or otherwise corrected;
    6. the type of notifiers that issued the notices or flags (private individuals, organisations, corporations, trusted flaggers etc.), where such details are provided, and the total number of their notices or flags;
    7. the description of the content moderation model applied by the hosting service provider, which includes but is not limited to the internal decision-making process, the number of staff employed for content moderation, including their location, minimum education and language skills for such employees, as well as the existence, process, rationale, reasoning and possible outcome of any automated systems used.
  3. The Commission shall, by means of delegated acts pursuant to Article 28, define a direct data access regime for qualified independent researchers and national supervisory authorities based on the technical functionalities of hosting service providers, including high-level aggregate metrics of content moderation practices.
  4. The obligations set out in this Article shall only apply to large hosting service providers with more than 5 million monthly active users in the Union. A supervisory authority may decide to extend these obligations to small and medium-sized companies and impose tiered obligations on companies of different sizes as needed.

Chapter VI

Article 28
Delegated acts

  1. The power to adopt delegated acts is conferred on the Commission subject to the conditions laid down in this Article.
  2. The delegation of power referred to in Article 3(3), Article 22(3), Article 25(4) and Article 27(3) shall be conferred on the Commission for an indeterminate period of time from [date of entering into application of this law].
  3. The delegation of power referred to in Article 3(3), Article 22(3), Article 25(4) and Article 27(3) may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force.
  4. As soon as it adopts a delegated act, the Commission shall notify it simultaneously to the European Parliament and to the Council.
  5. A delegated act adopted pursuant to Article 3(3), Article 22(3), Article 25(4) and Article 27(3) shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of three months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.

ANNEX I

REMOVAL ORDER FOR ILLEGAL CONTENT

In accordance with …… (legal basis) the addressee of the removal order shall remove or disable access to the illegal content.

SECTION A:

Issuing Member State:


NB: details of issuing authority to be provided at the end (Sections E and F) 
Addressee (legal representative) 


Addressee (contact point) 


Member State of jurisdiction of addressee: [if different to issuing state]


Time and date of issuing the removal order 


Reference number of the removal order:


SECTION B:

Content to be removed or access to it disabled without undue delay: 

A URL and any additional information enabling the identification and exact location of the content referred:


Reason(s) explaining why the content is considered illegal content:


Additional information on the reasons why the content is considered illegal content (optional):


SECTION C:

Information to content provider 

Please note that (tick, if applicable):

□ for reasons of public security, the addressee must refrain from informing the content provider whose content is being removed or to which access has been disabled.

Otherwise: Details of possibilities to contest the removal order in the issuing Member State (which can be passed to the content provider, if requested) under national law; see Section G below:

SECTION D:

Informing Member State of jurisdiction

□ Tick if the state of jurisdiction of the addressee is other than the issuing Member State:

□ a copy of the removal order is sent to the relevant competent authority of the state of jurisdiction 

SECTION E:

Details of the authority which issued the removal order 

The type of authority which issued this removal order (tick the relevant box):

□ judge, court, or investigating judge 

□ law enforcement authority

Details of the issuing authority and/or its representative certifying the removal order as accurate and correct:

Name of authority:


Name of its representative:


Post held (title/grade):


File No:


Address:


Tel. No: (country code) (area/city code)


Fax No: (country code) (area/city code)


Email:


Date:


Official stamp (if available) and signature:

SECTION F:

Contact details for follow-up

Contact details where issuing authority can be reached to receive feedback on time of removal or to provide further clarification:


Contact details of the authority of the state of jurisdiction of the addressee [if different to the issuing Member State]


SECTION G:

Information about redress possibilities

Information about competent body or court, deadlines and procedures, including formal requirements for contesting the removal order:

Competent body or court to contest the removal order:


Deadline for contesting the decision: Xxx months starting from xxxx
Link to provisions in national legislation:


ANNEX II

FEEDBACK FORM FOLLOWING REMOVAL OF ILLEGAL CONTENT (Article 8(6) of Regulation (EU) xxx)

SECTION A:

Addressee of the removal order:


Authority which issued the removal order: 


File reference of the issuing authority:


File reference of the addressee:


Time and date of receipt of removal order:


SECTION B:

The illegal content subject to the removal order has been (tick the relevant box):

□ removed 

□ disabled 

Time and date of removal or disabling access:


SECTION C:

Details of the addressee

Name of the hosting service provider/ legal representative:


Member State of main establishment or of establishment of the legal representative:


Name of the authorised person:


Details of contact point (Email):


Date:


ANNEX III

INFORMATION ON THE IMPOSSIBILITY TO EXECUTE THE REMOVAL ORDER (Article 8(7) and (8) of Regulation (EU) xxx)

SECTION A:

Addressee of the removal order: 


Authority which issued the removal order: 


File reference of the issuing authority:


File reference of the addressee:


Time and date of receipt of removal order: 


SECTION B:

Reasons for non-execution

(i) The removal order cannot be executed or cannot be executed within the requested deadline for the following reason(s):

□ force majeure or de facto impossibility not attributable to the addressee or the service provider, including for technical or operational reasons 

□ the removal order contains manifest errors

□ the removal order does not contain sufficient information 

(ii) Please provide further information as to the reasons for non-execution:


(iii) If the removal order contains manifest errors and/or does not contain sufficient information, please specify which errors and what further information or clarification is required:


SECTION C:

Details of the service provider / its legal representative

Name of the service provider/ legal representative:


Name of the authorised person:


Contact details (Email):


Signature:


Time and date:

