Notice and Action
Right now, there are no clear rules on how online platforms, such as YouTube, Twitter, Facebook or Instagram, should deal with user complaints and potentially illegal content online.
Over the past year, the European Parliament has been calling for legislation to regulate online platforms. The European Commission has postponed such legislation for the past seven years, while very few Member States have introduced national laws of their own.
As a consequence, online platforms have started creating their own rules based on community standards, which has often resulted in the spread of hateful and other problematic content, but also in the over-removal of lawful content online.
For this reason, the Greens/EFA group in the European Parliament is drafting new EU legislation for a notice-and-action mechanism. The draft legislative text aims to provide a basis for future EU legislation. It is open for public consultation until 1 November.
To give your feedback on the model rules, please use the comment section. You can also upvote or downvote specific paragraphs or other people’s comments.
Two ways to give feedback:
- You can leave a general remark concerning the text as a whole here.
- You can amend single paragraphs using the plus icons. Furthermore, you can comment while reading (without having to scroll to the very bottom), and you can even discuss existing annotations.
Regulation on procedures for notifying and acting on illegal content hosted by information society services
(“Regulation on Notice and Action procedures”)
Subject matter and objective
- This Regulation lays down rules establishing procedures for notifying and acting on illegal content hosted by information society services and for ensuring their proper functioning.
- This Regulation seeks to contribute to the proper functioning of the internal market by ensuring the free movement of intermediary information society services in full respect of the Charter of Fundamental Rights, in particular its Article 11 on the freedom of expression and information.
For the purpose of the Regulation:
- ‘hosting service provider’ means any information society service provider that hosts, stores, selects or references publicly available content, or publicly distributes content provided by a user of the service.
- ‘content’ means any concept, expression or information in any format such as text, images, audio and video.
- ‘illegal content’ means any content which is illegal under the law of the Member State where it is hosted.
- ‘content moderation’ means the practice of sorting content provided by a recipient of a service by applying a pre-determined set of rules and guidelines in order to ensure that the content complies with legal and regulatory requirements, and terms and conditions, as well as any resulting measure taken by the hosting service provider, such as removal of the content or the deletion or suspension of the user’s account.
- ‘content provider’ refers to any recipient of a hosting service.
- ‘notice’ means any communication about illegal content addressed to a hosting service provider with the objective of obtaining the removal of, or the disabling of access to, that content.
- ‘notice provider’ means any natural or legal person submitting a notice.
- ‘counter-notice’ means a notice through which the content provider challenges a claim of illegality as regards content provided by that provider.
- ‘terms and conditions’ means all terms, conditions or rules, irrespective of their name or form, which govern the contractual relationship between the content hosting platform and its users and which are unilaterally determined by the hosting service provider.
Notifying Illegal Content
The right to notify
The hosting service provider shall establish an easily accessible and user-friendly mechanism that allows natural or legal persons to notify the provider of illegal content it hosts. Such a mechanism shall allow for notification by electronic means.
Standards for notices
- Hosting service providers shall ensure that notice providers can submit notices which are sufficiently precise and adequately substantiated to enable the hosting service provider to take an informed decision about the follow-up to that notice. For that purpose, notices shall at least contain the following elements:
- an explanation of the reasons why the content should be considered illegal.
- proof or any other documentation to support the claim and potential legal grounds.
- a clear indication of the exact location of the illegal content (URL and timestamp where appropriate).
- a declaration of good faith that the information provided is accurate.
- Notice providers shall be given the choice whether to include their contact details in the notice. Where they decide to do so, their anonymity towards the content provider shall be ensured; the identity of the notice provider shall, however, be disclosed in cases of violations of personality rights or of intellectual property rights.
- Notices shall not automatically trigger legal liability, nor shall they impose a requirement to remove specific pieces of content or to assess their legality.
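Purely as an illustration, the minimum elements of a notice listed above could be modelled as a simple data structure. The draft text prescribes only the information a notice must contain, not any particular format; all field and method names below are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Notice:
    """Illustrative sketch of the minimum elements of a notice."""
    # Explanation of why the content should be considered illegal
    reason: str
    # Proof or other documentation supporting the claim
    evidence: str
    # Exact location of the content, e.g. a URL
    location: str
    # Timestamp where appropriate (e.g. a position in a video)
    timestamp: Optional[str] = None
    # Declaration of good faith that the information provided is accurate
    good_faith_declaration: bool = False
    # Optional contact details; anonymity towards the content provider is ensured
    contact_details: Optional[str] = None

    def is_sufficiently_substantiated(self) -> bool:
        """All mandatory elements must be present for the hosting service
        provider to take an informed decision about the follow-up."""
        return bool(self.reason and self.evidence and self.location
                    and self.good_faith_declaration)
```

A notice missing any mandatory element would not, under this sketch, support an informed decision by the provider.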
Acting on Illegal Content
The right of a notice provider to be informed
- Upon receipt of a notice, the hosting service provider shall immediately send a confirmation of receipt to the notice provider.
- Upon deciding whether or not to act on the basis of a notice, the hosting service provider shall inform the notice provider accordingly.
The right of a content provider to be informed
- Upon receipt of a notice, the hosting service provider shall immediately inform the content provider about the notice, the decision taken and the reasoning behind it, how the decision was made and whether it was made by a human alone or aided by automated content moderation tools, the possibility to issue a counter-notice, and how either party may appeal a decision with the hosting service provider, the courts or other entities.
- The first paragraph of this article shall only apply if the content provider has supplied sufficient contact details to the hosting service provider.
- The first paragraph shall not apply where this is justified by reasons of public policy or public security provided for by law, in particular where informing the content provider would run counter to the prevention and prosecution of serious criminal offences. In such cases, the information shall be provided as soon as the reasons not to inform cease to exist or the investigation has concluded.
Complaint and redress mechanism
- Hosting service providers shall establish effective and accessible mechanisms allowing content providers whose content has been removed or access to it disabled by manual or automated means to submit a counter-notice against the action of the hosting service provider requesting reinstatement of their content.
- Hosting service providers shall promptly examine every complaint that they receive and reinstate the content without undue delay where the removal or disabling of access was unjustified. They shall inform the complainant about the outcome of the examination.
- The hosting service provider shall ensure that, upon receipt of a notice, the provider of that content shall have the right to contest any action by issuing a counter-notice.
- The hosting service provider shall promptly assess the counter-notice and take due account of it.
- If the counter-notice provides reasonable grounds to consider that the content is not illegal, hosting service providers may decide not to act or to restore, without undue delay, access to the content that was removed or to which access was temporarily disabled, and shall inform the content provider of such restoration. The hosting service provider shall ensure that the provider of the counter-notice and the provider of the initial notice are informed about the follow-up to a counter-notice, provided their contact details were supplied.
- Paragraphs 1 to 3 of this article are without prejudice to the right to an effective remedy and to a fair trial enshrined in Article 47 of the Charter.
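The counter-notice step described above can be sketched as a small state transition, purely for illustration. The names `ContentStatus` and `handle_counter_notice` are hypothetical and not part of the draft text.

```python
from enum import Enum, auto

class ContentStatus(Enum):
    ONLINE = auto()
    REMOVED = auto()    # content removed or access disabled after a notice
    RESTORED = auto()   # content reinstated following a counter-notice

def handle_counter_notice(status: ContentStatus,
                          reasonable_grounds: bool) -> ContentStatus:
    """Sketch of the counter-notice step: if the counter-notice provides
    reasonable grounds to consider the content legal, access is restored
    without undue delay; otherwise the earlier decision stands."""
    if status is ContentStatus.REMOVED and reasonable_grounds:
        return ContentStatus.RESTORED
    return status
```

Under this sketch, a counter-notice without reasonable grounds leaves the removal in place, while a substantiated one leads to reinstatement.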
Transparency on Notice and Action Procedures
The right of transparency on notice-and-action procedures
- Hosting service providers shall provide clear information about their notice-and-action procedures in their terms and conditions.
- Hosting service providers shall publish annual reports in a standardized format including:
- the number of all notices received under the notice-and-action system categorised by the type of content.
- information about the number and type of illegal content which has been removed or access to it disabled, including the corresponding timeframes per type of content.
- the number of erroneous takedowns.
- the type of entities that issued the notices (private individuals, organisations, corporations, trusted flaggers, Member State and EU bodies, etc.) and the total number of their notices.
- information about the number of complaint procedures, contested decisions and actions taken by the hosting service providers;
- the description of the content moderation model applied by the hosting service provider, including the existence, process, rationale, reasoning and possible outcome of any automated systems used.
- information about the number of redress procedures initiated and decisions taken by the competent authority in accordance with national law.
- the measures they adopt with regard to repeat infringers to ensure that those measures are effective in tackling such systemic abusive behaviour.
- When automatic content moderation tools are used, transparency and accountability shall be ensured by independent and impartial public oversight. To that end, national supervisory authorities shall have access to the software documentation, the facilities, and the datasets, hardware and software used.
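The reporting elements above could, for example, be captured in a machine-readable structure. The draft prescribes the content of the standardised annual reports, not their format, so every field name in this skeleton is hypothetical.

```python
# Hypothetical skeleton of a standardised annual transparency report,
# mirroring the reporting elements listed in the draft article.
annual_report = {
    "notices_received_by_content_type": {},   # e.g. {"copyright": 0, "defamation": 0}
    "content_removed_or_disabled": {},        # counts and timeframes per type of content
    "erroneous_takedowns": 0,
    "notices_by_entity_type": {},             # individuals, trusted flaggers, EU bodies, ...
    "complaint_procedures": {"received": 0, "decisions_contested": 0},
    "content_moderation_model": "",           # description incl. any automated systems
    "redress_procedures": {"initiated": 0, "authority_decisions": 0},
    "repeat_infringer_measures": "",
}
```

A common structure of this kind is what would make reports comparable across providers, which is the point of requiring a standardised format.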
Transparency obligations for Member States
Member States shall publish annual reports including:
- the number of all judicial procedures issued to hosting service providers categorised by type of content.
- information on the number of cases of successful detection, investigation and prosecution of offences that were notified to hosting service providers by notice providers and Member State authorities.
Independent Dispute Settlement
Independent dispute settlement
- Member States shall establish independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse to appeal decisions on content moderation.
- The independent dispute settlement bodies shall meet the following requirements:
- they are composed of legal experts while taking into account the principle of gender balance;
- they are impartial and independent;
- their services are affordable for users of the hosting services concerned;
- they are capable of providing their services in the language of the terms and conditions which govern the contractual relationship between the provider of hosting services and the user concerned;
- they are easily accessible either physically in the place of establishment or residence of the business user, or remotely using communication technologies;
- they are capable of providing their mediation services without undue delay.
- The referral of a question regarding content moderation to an independent dispute settlement body shall be without prejudice to the right to an effective remedy and to a fair trial enshrined in Article 47 of the Charter.
Procedural rules for independent dispute settlement
- The content provider, the notice provider and non-profit entities with a legitimate interest in defending freedom of expression and information shall have the right to refer a question of content moderation to the competent independent dispute settlement body in case of a dispute regarding a content moderation decision taken by the hosting service provider.
- As regards jurisdiction, the competent independent dispute settlement body shall be the one located in the Member State in which the notice was filed. For natural persons, it shall always be possible to bring complaints to the independent dispute settlement body of their Member State of residence.