The "Notice and Action" Mechanism

Many thanks to everyone who sent us their suggestions and comments: we received 78 of them via the website, and many of you also contacted us directly by email.

The comment section is now closed and we are working on an updated version of the "Notice and Action" model rules for the upcoming Digital Services Act. We will present the final text here soon.

The Greens/EFA group in the European Parliament is preparing new legislation for a "Notice and Action" mechanism, with the aim of providing the basic building blocks for future legislation at the European level. You can take part in the public consultation until 1 November.

To give feedback on the model rules, you can use the comment section. You can also vote for or against individual paragraphs and existing comments.

At present there are no clear and uniform rules for online platforms (YouTube, Twitter, Facebook or Instagram) defining how user complaints are to be handled, or how illegal content itself is to be dealt with.

The European Parliament has been calling for legislation regulating online platforms for years. Nevertheless, the European Commission has postponed such legislation for the past seven years, despite the fact that only very few Member States have introduced their own national rules on the issue.

In the absence of uniform legislation, platforms create their own rules based on community standards. This very often leads not only to the excessive removal of lawful content online, but also to the spread of hateful and other problematic content.

Two ways to give feedback:

  1. You can leave general feedback on the text of the proposal as a whole here.
  2. You can annotate individual paragraphs using the plus icons. You can also comment on individual paragraphs as you read (so you do not need to scroll anywhere to add a comment). Finally, you can discuss existing annotations.

Regulation on procedures for notifying and acting on illegal content hosted by information society services

(“Regulation on Notice and Action procedures”)

Chapter I

General Provisions

Article 1 Subject matter and objective

  1. This Regulation lays down rules to establish procedures for notifying and acting on illegal content hosted by information society services and to ensure their proper functioning.
  2. This Regulation seeks to contribute to the proper functioning of the internal market by ensuring the free movement of intermediary information society services in full respect of the Charter of Fundamental Rights, in particular its Article 11 on the freedom of expression and information.

Article 2 Definitions

For the purpose of the Regulation:
  1. ‘hosting service provider’ means any information society service provider that hosts, stores, selects, references publicly available content or distributes content publicly provided by a user of the service.
  2. ‘content’ means any concept, expression or information in any format such as text, images, audio and video.
  3. ‘illegal content’ refers to ‘illegal information’ found illegal under the law of the Member State where it is hosted.
  4. ‘content moderation’ means the practice of sorting content provided by a recipient of a service by applying a pre-determined set of rules and guidelines in order to ensure that the content complies with legal and regulatory requirements, and terms and conditions, as well as any resulting measure taken by the hosting service provider, such as removal of the content or the deletion or suspension of the user’s account.
  5. ‘content provider’ refers to any recipient of a hosting service.
  6. ‘notice’ means any communication about illegal content addressed to a hosting service provider with the objective of obtaining the removal of or the disabling of access to that content.
  7. ‘notice provider’ means any natural or legal person submitting a notice.
  8. ‘counter notice’ means a notice through which the content provider challenges a claim of illegality as regards content provided by the said provider.
  9. ‘terms and conditions’ means all terms, conditions or rules, irrespective of their name or form, which govern the contractual relationship between the content hosting platform and its users and which are unilaterally determined by the hosting service provider.

Chapter II

Notifying Illegal Content

Article 3 The right to notify

The hosting service provider shall establish an easily accessible and user-friendly mechanism that allows natural or legal persons to notify the concerned illegal content hosted by that provider. Such mechanism shall allow for notification by electronic means.

Article 4 Standards for notices

  1. Hosting service providers shall ensure that notice providers can submit notices which are sufficiently precise and adequately substantiated to enable the hosting service provider to take an informed decision about the follow-up to that notice. For that purpose, notices shall at least contain the following elements:
    a) an explanation of the reason on which the content can be considered illegal;
    b) proof or any other documentation to support the claim and potential legal grounds;
    c) a clear indication of the exact location of the illegal content (URL and timestamp where appropriate);
    d) a declaration of good faith that the information provided is accurate.
  2. Notice providers shall be given the choice whether to include their contact details in the notice. Where they decide to do so, their anonymity towards the content provider shall be ensured; the identification of notice providers should however be provided in cases of violations of personality rights or of intellectual property rights.
  3. Notices shall not automatically trigger legal liability nor should they impose any removal requirement for specific pieces of the content or for the legality assessment.
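
Read as a data model, the minimum elements of Article 4 amount to a short, fixed list of fields. The following TypeScript sketch is purely illustrative and not part of the legal text; every field name is hypothetical.

    // Purely illustrative sketch (not part of the draft): a hypothetical shape for a
    // notice carrying the minimum elements of Article 4(1)(a)-(d) and the optional
    // contact details of Article 4(2). All field names are invented for this sketch.
    interface Notice {
      explanationOfIllegality: string;   // a) why the content can be considered illegal
      supportingEvidence: string[];      // b) proof or other documentation and potential legal grounds
      contentLocation: {                 // c) exact location of the content
        url: string;
        timestamp?: string;              // where appropriate, e.g. a position in a video
      };
      goodFaithDeclaration: boolean;     // d) declaration that the information provided is accurate
      contactDetails?: {                 // Art. 4(2): optional; anonymity towards the content provider
        name: string;
        email: string;
      };
    }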

Chapter III

Acting on Illegal Content

Article 5 The right of a notice provider to be informed

  1. Upon receipt of a notice, the hosting service provider shall immediately send a confirmation of receipt to the notice provider.
  2. Upon the decision to act or not on the basis of a notice, the hosting service provider informs the notice provider accordingly.

Article 6 The right of a content provider to be informed

  1. Upon receipt of a notice, the hosting service provider shall immediately inform the content provider about the notice, the decision taken and the reasoning behind it, how such decision was made, whether the decision was made by a human only or aided by automated content moderation tools, about the possibility to issue a counter-notice, and how to appeal a decision by either party with the hosting service provider, courts or other entities.
  2. The first paragraph of this article shall only apply if the content provider has supplied sufficient contact details to the hosting service provider.
  3. The first paragraph shall not apply where there are justified reasons of public policy and public security as provided for by law, and in particular where informing the content provider would run counter to the prevention and prosecution of serious criminal offences. In such cases the information should be provided as soon as the reasons not to inform cease to exist or an investigation has concluded.

Article 7 Complaint and redress mechanism

  1. Hosting service providers shall establish effective and accessible mechanisms allowing content providers whose content has been removed or access to it disabled by manual or automated means to submit a counter-notice against the action of the hosting service provider requesting reinstatement of their content.
  2. Hosting service providers shall promptly examine every complaint that they receive and reinstate the content without undue delay where the removal or disabling of access was unjustified. They shall inform the complainant about the outcome of the examination.
  3. The hosting service provider shall ensure that, upon receipt of a notice, the provider of that content shall have the right to contest any action by issuing a counter-notice.
  4. The hosting service provider shall promptly assess the counter-notice and take due account of it.
  5. If the counter-notice has provided reasonable grounds to consider that the content is not illegal, hosting service providers may decide not to act or to restore, without undue delay, access to the content that was removed or to which access was temporarily disabled, and inform the content provider of such restoration. The hosting service provider shall ensure that the provider of the counter-notice and the provider of the initial notice are informed about the follow-up to a counter-notice, granted their contact details were provided.
  6. Paragraphs 1 to 3 of this article are without prejudice to the right to an effective remedy and to a fair trial enshrined in Article 47 of the Charter.
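
Taken together, Articles 5 to 7 describe a procedural flow around each notice. As a reading aid only, and not as part of the legal text, that flow can be sketched as a simple state machine in TypeScript; the state names are hypothetical and several conditions are deliberately simplified.

    // Illustrative reading aid only; state names are hypothetical and many details
    // (deadlines, the Art. 6(3) exception, court or dispute-settlement referrals) are omitted.
    type NoticeState =
      | "received"                  // notice submitted under Articles 3 and 4
      | "receiptConfirmed"          // confirmation sent to the notice provider, Art. 5(1)
      | "contentProviderInformed"   // content provider informed, Art. 6(1)
      | "decided"                   // provider decides to act or not and informs the notice provider, Art. 5(2)
      | "counterNoticeFiled"        // content provider contests the action, Art. 7(1) and 7(3)
      | "reassessed";               // counter-notice assessed, content possibly reinstated, Art. 7(4)-(5)

    const transitions: Record<NoticeState, NoticeState[]> = {
      received: ["receiptConfirmed"],
      receiptConfirmed: ["contentProviderInformed"],
      contentProviderInformed: ["decided"],
      decided: ["counterNoticeFiled"],
      counterNoticeFiled: ["reassessed"],
      reassessed: [],
    };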

Chapter IV

Transparency on Notice and Action Procedures

Article 8 The right of transparency on notice-and-action procedures

  1. Hosting service providers shall provide clear information about their notice and action procedures in their terms and conditions.
  2. Hosting service providers shall publish annual reports in a standardized format including:
    a) the number of all notices received under the notice-and-action system categorised by the type of content;
    b) information about the number and type of illegal content which has been removed or access to it disabled, including the corresponding timeframes per type of content;
    c) the number of erroneous takedowns;
    d) the type of entities that issued the notices (private individuals, organisations, corporations, trusted flaggers, Member State and EU bodies, etc.) and the total number of their notices;
    e) information about the number of complaint procedures, contested decisions and actions taken by the hosting service providers;
    f) the description of the content moderation model applied by the hosting service provider, including the existence, process, rationale, reasoning and possible outcome of any automated systems used;
    g) information about the number of redress procedures initiated and decisions taken by the competent authority in accordance with national law;
    h) the measures they adopt with regards to repeated infringers to ensure that the measures are effective in tackling such systemic abusive behaviour.
  3. When automatic content moderation tools are used, transparency and accountability shall be ensured by independent and impartial public oversight. To that end, national supervisory authorities shall have access to the software documentation, the facilities, and the datasets, hardware and software used.
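
The reporting obligation in Article 8(2) amounts to a fixed set of annual metrics. Purely as an illustration, and again not as part of the legal text, a hypothetical report record covering points a) to h) could look like the following TypeScript sketch; all names are invented.

    // Purely illustrative (not part of the draft): a hypothetical record for the
    // annual report of Article 8(2)(a)-(h). All names are invented for this sketch.
    interface AnnualTransparencyReport {
      noticesReceivedByContentType: Record<string, number>;                                   // a)
      removalsByContentType: Record<string, { count: number; medianTimeframeHours: number }>; // b)
      erroneousTakedowns: number;                                                             // c)
      noticesByProviderType: Record<string, number>;                                          // d) individuals, organisations, trusted flaggers, public bodies, ...
      complaintProcedures: { complaints: number; contestedDecisions: number; actionsTaken: number }; // e)
      contentModerationModelDescription: string;                                              // f) including any automated systems used
      redressProcedures: { initiated: number; decisionsByCompetentAuthority: number };        // g)
      repeatInfringerMeasures: string;                                                        // h)
    }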

Article 9 Transparency obligations for Member States

Member States shall publish annual reports including:

  1. the number of all judicial procedures issued to hosting service providers categorised by type of content.
  2. information on the number of cases of successful detection, investigation and prosecution of offences that were notified to hosting service providers by notice providers and Member State authorities.

Chapter V

Independent Dispute Settlement

Article 10 Independent dispute settlement

  1. Member States shall establish independent dispute settlement bodies for the purpose of providing quick and efficient extra-judicial recourse to appeal decisions on content moderation.
    The independent dispute settlement bodies shall meet the following requirements:
    a) they are composed of legal experts while taking into account the principle of gender balance;
    b) they are impartial and independent;
    c) their services are affordable for users of the hosting services concerned;
    d) they are capable of providing their services in the language of the terms and conditions which govern the contractual relationship between the provider of hosting services and the user concerned;
    e) they are easily accessible either physically in the place of establishment or residence of the business user, or remotely using communication technologies;
    f) they are capable of providing their mediation services without undue delay.
  2. The referral of a question regarding content moderation to an independent dispute settlement body shall be without prejudice to the right to an effective remedy and to a fair trial enshrined in Article 47 of the Charter.

Article 11 Procedural rules for independent dispute settlement

  1. The content provider, the notice provider and non-profit entities with a legitimate interest in defending freedom of expression and information shall have the right to refer a question of content moderation to the competent independent dispute settlement body in case of a dispute regarding a content moderation decision taken by the hosting service provider.
  2. As regards jurisdiction, the competent independent dispute settlement body shall be that located in the Member State in which the notice has been filed. For natural persons, it should always be possible to bring complaints to the independent dispute settlement body of the Member States of residence.

The discussion is closed.

79 comments

  • General comments

  • Inline comments

  • 0
    0

    The whole thing is rather pointless as-is.
    It establishes a notice-and-takedown procedure, as well as the possibility to file counter notices and appeal to an alternate court system. But it doesn’t (as written now) change the current legal situation that hosting service providers are liable for the content they’re hosting on behalf of content providers, and therefore will aggressively purge content that might be illegal.

    In fact, even if a content provider files a counter notice, and the „independent dispute settlement body“ decides that the content should be reinstated, the hosting service provider WILL STILL BE LIABLE in case an actual court decides that the content in question was, indeed, illegal! Luckily, the hosting service provider is free to ignore any decision by that „independent dispute settlement body“, and therefore can and will simply keep the content offline to avoid that liability. IOW: the status quo.

    What would be needed is:
    1) the counter notice including an explicit affirmation that the content provider considers the content legal and is prepared to face legal responsibility for publicizing the content.
    2) the option for the hosting service provider to verify the identity of the content provider in a manner suitable to get them into an actual court.
    3) an explicit indemnification of the hosting service provider for hosting the content, provided 1) and 2) have occurred.

    …and none of that „independent dispute settlement“ stuff. There’s nothing to settle, because the problem is not a dispute between the content provider and the hosting service provider. The problem is determining the legality of the content. And that can, by definition, only be done by a proper court.
    (Omitting all the other problems with mandatory independent settlement schemes for brevity.)

  • 0
    0

    If I have no way of reading the text in German, I find it difficult to understand or comment on something as complex as a legal text/directive. Please introduce a language selection for your users, otherwise your initiative will remain something only for people who have a perfect command of English, i.e. elitist and undemocratic. At the very least, German (approx. 100 million native speakers) and French (approx. 70 million native speakers) in Europe should also be available.

    • 0
      0

      Thank you very much for your comment and your interest – we are currently working on the translation of the website, which can now be found at https://www.greens-efa.eu/mycontentmyrights/de/. Unfortunately, we currently have no further capacity to translate the draft legislation itself. Would it be possible for you to summarise your ideas in a general comment, which we would then gladly incorporate?

  • 2
    0

    A section on transparency in relation to content moderation that does not deal with illegal content might be useful. A lot of the content reported as being illegal is likely to be removed on the basis of terms of service, so some transparency/good practice might be useful.

  • 0
    0

    ILGA-Europe supports the position of civil society organisations such as EDRi and Access Now that a harmonised, transparent and rights-protective notice-and-action framework should be required to be an integral part of any IT platform. Such a framework will empower users to flag potentially illegal content and set clear, transparent and predictable requirements for intermediaries, and will require providers to put processes in place to deal responsibly and timely with such notifications with due regard for users’ free expression rights.

    At the same time, ILGA-Europe believes that the proposed draft legislation should be accompanied by strong action by EU institutions and Member States against any use of hate speech, especially by public leaders and politicians, including the clear condemnation of such hatred as incitement to discrimination, hostility or violence towards vulnerable groups, including against LGBTI people.

    In this regard, ILGA-Europe calls on Member States to:
    • Review existing laws against hate speech and hate crimes to ensure that they cover all hate speech, including online, and clearly prohibit the incitement to discrimination, hostility and violence based on all grounds, including sexual orientation, gender identity and gender expression and sex characteristics; or develop ‘hate speech’ legislation that meets the requirements of legality, necessity and proportionality, and legitimacy, and subject such rulemaking to robust public participation;
    • Actively consider and deploy good governance measures, including those recommended by the Rabat Plan of Action, to tackle hate speech with the aim of reducing the perceived need for bans on expression;
    • Establish or strengthen independent judicial mechanisms to ensure that individuals may have access to justice and remedies when suffering cognisable harm from hate crime and speech, including in the online world.

    In addition, to further the harmonisation of EU legislation and ensure full protection of vulnerable groups against hate speech both offline and online, the European Commission should assess how EU legislation protects all people from all forms of hate speech. The Commission, working with future EU Presidencies, Member States and the European Parliament, should explore grounds for legislative measures on the prohibition of incitement to discrimination, hostility and violence based on all grounds, including SOGIESC.

    National and European Union legislation on prohibition of incitement should be based on clear and accurate definitions of the concepts of “incitement”, “discrimination”, “violence” and “hostility”. All definitions should be based on sound and consistent human rights standards. In this regard, legislation can draw, inter alia, from the guidance and definitions provided in the Camden Principles.

  • 0
    0

    We would like to raise three concerns. These three issues have not been addressed, or not extensively enough.

    1. The regulation should exclude public authorities as notifiers. Those public authorities often already have the power to take down content that is considered to be illegal. If they don’t have that power, that is probably for a good reason. This power comes with all kinds of safeguards. For example, to order the take-down of content, Dutch police first need to obtain a warrant from a judge. If they were allowed to notify a content hosting provider based on this regulation and content is subsequently taken down, the authority would have an easy way to circumvent those safeguards.

    2. The regulation should ensure proportionality when content is taken down. What if content that is manifestly illegal can only be taken down by making an entire website or even server unavailable? Should the provider take down the illegal content and, by doing so, take down other, legal, information in the process? Or should all information remain accessible, including the illegal content? This can happen when the hosting provider only provides the rack space and network connectivity for a server, while not having access to the data stored on the server („colocation“ is the term often used for this kind of service).

    3. The regulation should differentiate between different types of content. The impact of an image of sexual exploitation of children is different from the impact of an infringement of copyright. For example, when receiving a notification of an image of sexual exploitation of children, an immediate take-down rather than waiting for a response from the content provider is probably justified. On the other hand, in the case of an alleged infringement of copyright, not much (additional) harm is done if the hosting service provider waits for a response from the content provider. The same applies to the previous point: the type of content will influence the proportionality.

    Furthermore we would like to support the comments made by others, in particular those of Access Now, Electronic Frontier Foundation and Joe McNamee.

  • 0
    0

    true, and is redundant with (or should be merged with) e)

  • 0
    0

    Should minimum quality standards be imposed for terms of service?

  • 0
    0

    …notify THE HOSTING SERVICE PROVIDER ABOUT the concerned illegal content…

  • 1
    0

    „illegal content“ should be replaced by „disputed content“, or „claimed illegal content“ or somesuch throughout. A big reason why we need a notice-and-action regulation is to reduce hosting service providers’ need to make decisions about legality.

  • 0
    0

    borked sentence:
    „…and the reasoning behind hosting service PROVIDERIT, how such decision was made, if..“
    Also, that sentence is far too long.
    Furthermore, the sentence currently includes „upon receipt of a notice, the hosting service provider shall immediately inform the content provider about [..] the decision taken“, implying decisions have to be made immediately after receipt of a notice, or otherwise granting wriggle room for notifying not immediately, but only after (some time has passed and) a decision was reached. This should be cleared up.

  • 2
    0

    „complaint“ is not defined, and seems to be used synonymously with „counter-notice“ here.
    Also, this paragraph is out of chronological order and seems mostly redundant with the following ones. Should be removed/merged.

  • 0
    0

    „entities that issued the notices“ are „notice providers“, as defined. Use the latter.

    How is the hosting service provider to know the type of notice providers? Are they to do identity verification on them? Does this implicitly disallow anonymous notices?
    At the very least it needs „insofar as known“ added, but preferably it should be removed altogether.

  • 0
    0

    This is suddenly not about (claimed) illegal content anymore, but about content moderation in general. Has no place here, remove.

  • 0
    0

    NO! NO! NO!
    Stop mandating these alternate-justice systems.

  • 0
    0

    „Member State in which the notice has been filed“ is not defined.
    If a resident of A, while being in B, electronically submits a notice to content hosting provider incorporated in C, whose facilities in D receive and handle the notice, where has the notice been filed?

    „complaints“ is not defined.

    „As regards jurisdiction, …“ followed by a rule about non-jurisdictional alternate courts. Seriously, just drop Articles 10+11.

  • 3
    0

    Formal requirements for a valid notice are an essential part of the legislative framework. The information that defines a valid notice sets the whole mechanism in motion. This is currently missing in the draft. „Any communication about illegal content“ is simply any notice that is submitted by private or public parties.

  • 0
    0

    Just a suggestion: I would probably change „notice provider“ to simply „notifier“.

  • 1
    0

    Is this meant to be cumulative or not? If it is cumulative, then I’m not sure what the additional criteria (selects, stores, references) bring. If not, I’m not sure that referencing publicly available content is tantamount to hosting it (although it can be). Hosts don’t generally distribute (broadcast) content. What is wrong with the description in ECD 14.1 that needs to be fixed?

  • 1
    0

    I’m not sure what „concept“ or „expression“ are adding to, say „any information in any [digital] format such as…

  • 0
    0

    This should also allow some nuance between „illegal content“ (CSAM) and content which has been posted illegally (personal data, for example).

  • 2
    0

    Content provider is a user that provides content – so the uploader. Your definition describes any user of the platform. This should be changed.

  • 2
    0

    Alternative suggestion:
    The process of assessing the [il]legality or potential unacceptability of third-party content, in order to decide whether certain content posted, or attempted to be posted, online should be demoted (i.e. left online but rendered less accessible), demonetised, left online or removed, for some or all audiences, by the service on which it was posted.

  • 0
    0

    the grounds for alleging that the content is illegal

  • 0
    0

    „shall be ensured, except in cases of alleged violations of personality rights or of intellectual property rights.“

  • 0
    0

    I’m not sure what „or for their legality assessment“ is supposed to mean. Does it mean that there shouldn’t be a requirement for the provider to assess the legality of the content?

  • 0
    0

    … if they have provided their contact details

  • 0
    0

    This needs to be a bit more granular – I’m not sure an accusation of a criminal act (CSAM) can be treated the same as a minor offence (copyright)

  • 0
    0

    This section needs a provision saying:
    NN. Content should be left online prior to any decision being taken, unless there are compelling grounds to remove it.

  • 0
    0

    I would add the following: A clearly formulated reason for a complaint accompanied with the legal basis for the assessment of the content

  • 1
    0

    i. Information on what data was accessed by law enforcement authorities as a result of investigations into illegal content removed by the hosting provider.

  • 1
    0

    This should not prevent recourse to justice if any of the parties prefer this option.

  • 0
    0

    I would separate these into two points, as they’re not related to each other

  • 0
    0

    I must admit that I am not quite sure why content moderation is included in the scope of this proposal. Platforms should not be encouraged to actively look for illegal content, and content moderation is more linked to content that violates their terms of service => which should not be in the scope of the law in the first place.

  • 1
    0

    Plus one on changing „concerned“ to „alleged“

  • 0
    0

    What do you mean by „follow up to that notice“? It seems that you leave the final decision and balancing-of-rights exercise in the hands of hosting providers, which, especially in the context of illegal content, is something we are not able to support. Private companies cannot be in charge of a matter that should be taken care of by public authorities. Member States cannot escape legal responsibility by delegating a function or a service to a private actor. Introducing a broad obligation upon private actors to apply proactive measures to assess and potentially remove content, while at the same time making them fully responsible for potential interference with fundamental rights, raises issues of compatibility with the positive obligations of the state under the EU Charter.

  • 1
    0

    Nowhere in the text is it specified who can submit a valid notice to hosting service providers. And this is directly linked to the requirement of actual knowledge, which is also omitted by the text. In order to safeguard the principle of legal certainty, you need to clarify what constitutes actual knowledge about illegal content being hosted by online platforms. The legal framework should provide for minimum standards when this type of knowledge is obtained by platforms. A court order issued by an independent judicial body should always constitute actual knowledge. Platforms’ failure to comply with the court order should result in the loss of the liability exemption.

  • 0
    0

    Why are terms of service relevant for illegal content? Notice and action procedures established by the legislative framework should only tackle illegal content, given that the assessment of the content’s legality is done by an independent judicial authority (at least in an ideal world). I would keep ToS and content moderation outside the scope of this proposal.

  • 1
    0

    Alternative list of formal requirements for a valid notice:

    A clearly formulated reason for a complaint accompanied with the legal basis for the assessment of the content;

    Exact location of the content that can be determined by the URL link;
    Evidence that substantiates the claim submitted by a notifier;

    Identity of a notifier only if it is necessary for further investigation of the claim and in full compliance with existing legal standards. In general, notifiers should not be forced to disclose their identity when reporting content.

  • 1
    0

    I’m interested to know how this proposal would interact with measures that companies are supposed to take to counter harmful content, since this definition would cover harmful content as well, and in general notice and action mechanisms – based on companies’ Terms of Service – would also apply to such content. The IMCO report suggests that “measures to combat harmful content should be regularly evaluated and developed further”. Now, how is this evaluation going to happen if the really good transparency measures that are outlined in article 9 don’t apply to these measures?
    Do you envisage having similar safeguards in the ex ante part of the DSA, where new obligations would apply for dominant/gatekeeper or systemic platforms?

  • 0
    0

    How can we ensure that this safeguard also applies to automated content moderation tools that are applied to harmful content (see also comment re: def of content moderation)? Perhaps good to clarify both things in a recital?

  • 0
    0

    Notice-and-action measures should specify defined time frames and procedures, such as: the time to forward the notification to the content provider; the time for the content provider to respond with a counter-notification; the time to make a decision about removing content or maintaining it online; the time to inform the involved parties about the decision taken; and the time to initiate a review of the decision by the courts.

    Moreover, we advocate for counter-notices submitted prior to any action taken by platforms against the notified content.

  • 1
    0

    This sentence does not specify any concrete obligation for platforms and is too vague. If you want to keep it, this should be the opening sentence of this Article.

  • 0
    0

    I thought it was ambiguous on this point, but perhaps given the definition of „notice“ it should be read to mean only notices of allegedly illegal content. So at minimum it should be clarified, and then personally I would clarify it to include TOS violations, as Joe discusses. I would also include some proportionality wiggle room for smaller platforms.

  • 0
    0

    And the number of removal requests delivered by governments using some mechanism other than a judicial procedure! IRUs should be captured here, as should other back-room government communications. At least aspirationally…
    To be honest I think getting transparency from specific agencies, regulators, or competent authorities (as in TERREG) that regularly do work touching platform content moderation would be less work, and more value, than trying to gather this info from judicial systems. But perhaps I underestimate how efficient the existing tracking of judicial rulings is.

  • 0
    0

    This is extremely sweeping. I would want to understand the details much more. E.g., which data sets is this talking about? Does it mean platforms save copies of every piece of removed (or challenged but not removed) content for potential future audits? If so, and if the GDPR issues can be resolved, that’s fascinating and let’s build other transparency possibilities on top of it! If not, what exactly are these authorities reviewing, and is the benefit worth the cost to all concerned? Etc.

  • 0
    0

    Does „may decide not to act“ mean „may decide to leave down“? I can see situations where that might be appropriate (like if the content violates the TOS anyway) but the meaning here is unclear, to me.

  • 0
    0

    Notice providers should be furnished with a receipt from the hosting service provider that is referenced by a unique identifier code, authenticated by an electronic certificate. This receipt must include the full content of the elements of the notice, the contact details (if they have been provided) of the notice provider, and the time and the date when the notice was sent.

  • 0
    0

    The requirement of a gender balance is a politically charged, paternalist objective. The composition of the dispute settlement body should be based on merit and skills.

  • 0
    0

    I also agree with previous comments on replacing „concerned“ with „alleged“

  • 0
    0

    Replace with “an explanation of the grounds for allegation that the disputed content is illegal”

  • 0
    0

    Add para.3. „The first and second paragraph of this article shall only apply if the notice provider has supplied sufficient contact details to the hosting service provider.“
    It will bring Article 5 in line with para. 2 of Article 4

  • 0
    0

    Consider replacing with “Upon the decision to act or not on the basis of a notice, the hosting service provider informs the notice provider about the decision taken and the reasoning behind the decision, how such decision was made, if the decision was made by a human only or aided by automated content moderation tools, and how to appeal a decision by either party with the hosting service provider, courts or other entities”.

  • 0
    0

    Consider adding “granted such details were provided” to bring this para. in line with para. 2 of Article 4

  • 0
    0

    We have generally understood content moderation as the way in which platforms address requests for takedowns and other measures (e.g. labelling) under their Terms of Service. If a company has to comply with legal requirements, we tend to think of it as online content regulation/compliance with legal requirements.

  • 0
    0

    In the ECD, hosting is a function. In practice it’s been applied to a range of services, including search, though search has also been found to fall under the ‘mere conduit’ exemption. Defining new categories besides those already existing in the ECD is likely to be fraught with difficulties. Before doing so, we would recommend considering what a new definition would add to the existing ones and what the implications are for liability. Also worth considering: the extent to which the ECD should be amended to codify the case-law of the CJEU on this point.

  • 1
    0

    „any communication“ sounds very broad: presumably a ‘valid’ notice would have to fulfil certain requirements? As noted by Access Now above, those requirements should be an essential part of any notice and action framework.

    Also, at this point, the notice can only be about „allegedly“ illegal content, unless the notice comes from a court.

    „removal or disabling of access“: this assumes that these are the only options available. If legal but harmful content is included (which we do not recommend), labelling or other measures might be a better solution.

  • 0
    0

    We agree with Access Now’s comments above. We have set out the types of requirements that should be applicable in a notice-and-notice system here: https://www.article19.org/wp-content/uploads/2020/04/ARTICLE-19s-Recommendations-for-the-EU-Digital-Services-Act-FINAL.pdf

    Identification of the notice provider may not be justified if personality rights include e.g. harassment claims. Identification is more likely to be justified in relation to defamation and intellectual property rights claims.

  • 0
    0

    One of the key problems with Article 14 of the E-Commerce Directive is that there’s been a lot of legal uncertainty about what kind of notice is sufficient for providers to obtain „actual knowledge“ of illegality and therefore trigger the loss of *immunity* from liability. In ARTICLE 19’s view, actual knowledge of *illegality* can only be obtained by a court order.

    In practice, the lack of clarity on what constitutes sufficient notice has led a lot of providers to simply remove content upon notice since they do not want to take the risk of being found liable if the case goes to court. Under the ECD, providers do not automatically lose immunity from liability when they receive notice from a third party and do nothing. But doing nothing increases the risk that they might lose it and be found liable if the notifier decides to go to court.

    Going back to this proposal, creating a clearer complaints mechanism is a positive step forward. That (valid?) notices do not automatically trigger the loss of immunity is also positive.

    At the same time, this system does not really provide a lot of legal certainty for providers and is ultimately more likely to result in a notice-and-takedown system, which is problematic for freedom of expression.

  • 0
    0

    Under Article 6 (1) of this proposal, the company makes a decision about legality and does so without hearing first from the content provider. This is problematic for two reasons:

    First, as we’ve mentioned before, we don’t think that private companies should decide whether content is legal or not. Even if this is just meant as an assessment of the company’s own liability risk, small companies are unlikely to have teams of lawyers who are able to make this assessment; they are more likely to remove content upon receipt of notice.

    Secondly, the decision is made without the content provider having been given an opportunity to contest the allegation made against him or her. It is therefore inconsistent with due process standards.

    We agree with Access’s comment above that counter-notice should at least take place before a decision is made.

  • 0
    0

    In this proposal, a counter-notice does not seem too different from a notice of appeal. As noted above, we believe that counter-notices should at least take place before a decision is made, though our strong preference is for counter-notices to be used as a means for the content provider and the notifier to resolve the dispute between themselves rather than the intermediary making a decision (Cf: https://www.article19.org/wp-content/uploads/2020/04/ARTICLE-19s-Recommendations-for-the-EU-Digital-Services-Act-FINAL.pdf)

  • 0
    0

    I know this is opening up Pandora’s box a little bit, but there should ideally be some frame for the decisions hosting providers take.
    1. If a hosting provider doesn’t act, they lose their liability protections. So there is an incentive to act.
    2. If a hosting provider doesn’t remove allegedly infringing content, they will likely be sued by the rightsholders. So there is an incentive to remove content.
    3. If the hosting provider removes legal content, users are much less likely to sue. So there is a much weaker incentive to not overblock.

    Not a ready baked solution, but some thoughts:
    One way to overcome this is to have organisations that regularly send fraudulent notices be banned from the N&A system for a period of time. But that is on the platform side. Platforms themselves could risk losing their liability protections if they systemically overblock.

  • 1
    0

    Suggest discussing the concept of information society services. The ECD explains that it relates to services normally provided for remuneration, a relatively broad concept. On the one hand, a broad understanding (as supported by legal doctrine) benefits an entire range of service providers, which fall under the freedom of establishment and limited liability provisions. On the other hand, it could be to the disadvantage of those providers, who do not necessarily seek profit or receive benefits other than monetary remuneration.

  • 0
    0

    Agree with Art 19. We would also like to add that Art 14 requires a hosting provider to act as an intermediary service provider (to benefit from the liability exemptions), which is a concept that requires clarification.

  • 0
    0

    It’s very important to make clear that intermediaries should not be held liable for choosing not to remove content simply because they received a private notification by a user (as you do under para 3). In this sense, it may be reasonable to amend the phrase “take an informed decision about the follow-up to that notice” and replace it with “to inform the hosting service provider about potentially illegal content and behavior”. The alternative is to link it to the notion of „to act or not“ in Art 5(2), but this would entail a significant risk to burden online intermediaries with legal assessments. It consequently leads to problems related to liability for non-acting (as the assessment can lead to positive knowledge about the (non)illegality of user hosted content).

  • 0
    0

    Beware of the pitfalls of good faith clauses for individual users. What is the consequence of not acting in good faith? And good faith about what exactly? Individual users can hardly assess legal concepts and the nature of infringements. We suggest reformulating this phrase to avoid the negative consequences of existing misguided co-regulatory approaches. For example, the NetzDG has compelled certain platforms to threaten users with the disabling of their accounts in case notifications turn out to be legally inaccurate.

  • 0
    0

    We support that notices should not automatically trigger legal liability (nor should they impose any removal requirements). However, you cannot escape the problem of liability for user content without addressing the situations under which liability is actually triggered. In order to protect freedom of expression and access to justice, we recommend working with the general principle that actual knowledge of illegality is only obtained by intermediaries if they are presented with a court order.

  • 0
    0

    Suggest clarifying all three concepts used in the draft Regulation: 1) Information Society Service 2) Online Intermediary Service 3) Hosting Service.

  • 0
    0

    +1 to noyb’s comment. If you wish to regulate local competences, you will need to use the termini of domicile/habitual residence (or other connecting factors). Also, the last sentence is not in line with the BXL Ia Regulation (speaking about domicile rather than residence) and leads to different competent bodies depending on whether the individual acts or whether a non-profit body acts on behalf of that individual (being a non-natural person). Notably, the ADR directive does not regulate these aspects (which gives more options to the affected individual to act thanks to potentially multiple competent bodies).

Please share!
