European Parliament Approves Amendments to Draft “Terrorist Content” Legislation
On April 17, 2019, the European Parliament approved several amendments to the European Commission's proposed Regulation on preventing the dissemination of terrorist content online. The Regulation requires, among other things, that "hosting service providers" remove terrorist content from their platforms within one hour of receiving a removal order. Service providers could be fined up to four percent of their global annual turnover for systematic failure to remove content.
The amendments narrow the types of content covered, limit the Regulation's applicability to service providers that disseminate content to the public, and make proactive detection measures voluntary rather than mandatory. Notably, however, the European Parliament rejected an amendment that would have allowed companies more time to comply with removal orders. Key amendments include the following:
- Narrowing the definition of "terrorist content." The amendment narrows the proposed definition of "terrorist content" to "inciting, encouraging, promoting or glorifying in any way" the offenses described in the 2017 directive on combating terrorism. It also clarifies that the Regulation does not apply to content "disseminated for educational, artistic, journalistic or research purposes, or for awareness raising purposes against terrorist activity," and that the expression of polemic or controversial views on sensitive political questions should not be considered terrorist content. (The prior definition was broader, and included inciting or advocating, including by glorifying, the commission of terrorist offences, thereby causing a danger that such acts be committed; encouraging the contribution to terrorist offences; promoting the activities of a terrorist group, in particular by encouraging the participation in or support to a terrorist group; or instructing on methods or techniques for the purpose of committing terrorist offences.)
- Narrowing the definition of "hosting service provider." The amendment narrows the definition of "hosting service provider" to service providers that enable dissemination of content to the public as opposed to any third party, as was provided in the commission proposal. The revised definition would also exclude service providers offering infrastructure layers (e.g., cloud infrastructure services and electronic communication services).
- Specifying independent or judicial competent authorities. The amendment adds a definition of "competent authority," the authority charged with enforcing the Regulation, and requires that it be a "single designated judicial authority or functionally independent administrative authority in a Member State."
- Voluntary (not required) proactive measures. The amendment removes the requirement that service providers take proactive measures to protect their services from the dissemination of terrorist content. Instead, service providers "may take specific measures to protect their services against the public dissemination of terrorist content." Competent authorities may request that a service provider implement "necessary, proportionate and effective additional specific measures" if the service provider has received a substantial number of removal orders, but these measures should neither impose a general monitoring obligation nor require the use of automated tools.
- No referrals. The amendment deletes Article 5 of the Commission proposal, which would have allowed competent authorities to issue "referrals" under which providers must "assess the content identified in the referral against [their] own terms and conditions and decide whether to remove that content or to disable access to it."
What's Next?
The final text of the Regulation will be negotiated with the Council of Ministers after the European Parliament elections in May 2019. The Regulation could be adopted if the Parliament and the Council of Ministers agree on the amendments.
Perkins Coie attorneys are following this proposed Regulation closely and analyzing whether and to what extent it potentially conflicts with U.S. law, including the First Amendment to the U.S. Constitution, the Communications Decency Act and the Electronic Communications Privacy Act.
Clients with questions about the draft Regulation, including how it may affect content moderation and enforcement efforts, should contact experienced counsel.
© 2019 Perkins Coie LLP