
A journal of the Nordic Counter Terrorism Network (NCTN)
and the International Association for Counter Terrorism and Security Professionals (IACSP)
The European regulation on terrorist content online, which became applicable in June 2022, is gradually being implemented. As part of the European project Tech Against Terrorism Europe (TaTE), we examine six key aspects of this regulation, which aims to improve the detection and removal of terrorist content at a supranational level.
Terrorism remains a serious threat to EU member states. According to Europol’s latest report on terrorism trends in the EU (published in 2023), 28 attacks (completed, failed, or foiled) were recorded in the EU in 2022. Additionally, 380 people were arrested for terrorism-related offenses.
New technologies and the Internet continue to be widely used to spread terrorist ideologies, recruit new members, and radicalize individuals. While social networks, forums, and online video game platforms remain very popular, terrorist groups are also increasingly using decentralized platforms, which are often less well protected against terrorist exploitation. When the EU presented the draft regulation in 2018, it had already identified 150 platforms used for terrorist purposes (IOCTA 2017). We present six key aspects of this regulation below.
1) Hosting Service Providers (HSPs) involved
The regulation applies to hosting service providers (HSPs) that meet three criteria. First, they store data. Second, they disseminate it to the public, meaning that access to the information requires neither registration nor manual approval of admission; if admission is validated automatically, without human intervention, the content is still considered to be disseminated to the public. This includes, for example, social media platforms; video, image, and sound sharing services; and file-sharing and other cloud services, insofar as the stored information is made available to the public. Third, the data must be stored at the direct request of the content provider. Cloud services are therefore not considered hosting service providers when they offer their services to other service providers, because in that case they do not store content at the direct request of the content provider.
2) Does the regulation apply to hosting service providers worldwide?
Not quite, but potentially almost. An HSP falls within the scope of the regulation if it has a substantial connection with an EU country. A connection with the EU is considered substantial when:
– The HSP is established in the Union.
– Its services are used by a significant number of users in one or more Member States (at present, there is no official indication of what constitutes a significant number).
– Its activities are intended for one or more Member States.
One of these three criteria is sufficient to consider that a substantial connection exists.
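For engineers at a platform assessing whether the regulation applies to them, the logic above reduces to a simple disjunction. The following sketch is purely illustrative (the type and function names are this article's assumptions, and recall that "significant number of users" has no official definition):

```python
from dataclasses import dataclass

@dataclass
class HSPProfile:
    """Hypothetical profile of a hosting service provider (illustrative only)."""
    established_in_eu: bool      # criterion 1: established in the Union
    significant_eu_users: bool   # criterion 2: significant number of users in one or more Member States
    targets_member_states: bool  # criterion 3: activities intended for one or more Member States

def has_substantial_connection(hsp: HSPProfile) -> bool:
    # Any single criterion is sufficient to establish a substantial connection.
    return (hsp.established_in_eu
            or hsp.significant_eu_users
            or hsp.targets_member_states)
```

A provider established outside the EU but with a significant EU user base would thus still fall within scope.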
3) Diverse impact on Hosting Service Providers (HSPs)
The regulation distinguishes between HSPs in general and HSPs “exposed to terrorist content”. HSPs exposed to terrorist content are those that have received at least two final removal orders in the last 12 months.
All HSPs are legally required only to designate a point of contact and a legal representative. HSPs exposed to terrorist content must, in addition, take specific measures to protect their services against the dissemination of terrorist content. Possible measures include mechanisms for reporting suspicious content; the use of automated tools is not required.
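The “exposed” status can be pictured as a rolling count of final removal orders. The sketch below is illustrative only; the function name and the 365-day approximation of the 12-month window are this article's assumptions:

```python
from datetime import datetime, timedelta

def is_exposed_to_terrorist_content(final_order_dates: list, now: datetime) -> bool:
    """Illustrative check: an HSP counts as 'exposed to terrorist content' once
    it has received at least two final removal orders in the previous 12 months
    (approximated here as 365 days)."""
    window_start = now - timedelta(days=365)
    recent = [d for d in final_order_dates if window_start <= d <= now]
    return len(recent) >= 2
```

Note that only *final* removal orders count: orders still under review or appeal would not enter the list.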
4) The key measure: removal orders
Removal orders are the flagship measure of the regulation. If an HSP receives a removal order, it means that the content in question has been deemed terrorist by a competent authority. Three Member States (Poland, Portugal, and Slovenia) have yet to designate their competent authorities.
The HSP is obliged to act on the order immediately, and in any event within one hour of receiving it. Removal can take three technical forms:
– The content is removed (deleted) from the site.
– Access to the content is blocked/disabled.
– The content is geo-blocked in the European Union.
The HSP must immediately inform the competent authority (see Annex 2 of the Regulation). It must also notify the content provider and retain the removed or blocked terrorist content for six months, so that the parties involved can appeal and contest the removal order. The competent authority or a court may, where necessary, order that the content be stored for a longer period (for example, if legal proceedings are pending).
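From an HSP's operational standpoint, the obligations above amount to a small compliance record per order: which of the three actions was taken, whether the one-hour deadline was met, and how long the content must be retained. The following sketch is illustrative only (the field names and the 183-day approximation of six months are this article's assumptions):

```python
from datetime import datetime, timedelta
from enum import Enum

class RemovalAction(Enum):
    DELETE = "content deleted from the site"
    DISABLE_ACCESS = "access to the content disabled"
    GEO_BLOCK_EU = "content geo-blocked within the European Union"

ONE_HOUR = timedelta(hours=1)    # deadline after receipt of the removal order
RETENTION = timedelta(days=183)  # roughly six months, to allow appeals

def log_compliance(received_at: datetime, acted_at: datetime,
                   action: RemovalAction) -> dict:
    """Illustrative compliance record for a removal order."""
    return {
        "action": action.value,
        "within_one_hour": acted_at - received_at <= ONE_HOUR,
        # the competent authority or a court may order longer storage
        "retain_until": acted_at + RETENTION,
    }
```

Such a record would also feed naturally into the annual transparency report discussed in section 6.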
5) Removal orders and potential dangers for fundamental rights
Many observers opposed the regulation for three main reasons. Firstly, the one-hour deadline for complying with a removal order may push HSPs to resort to automated processes. Some fear that these tools remove content inaccurately and imprecisely (even more so than human moderation) and further accentuate the lack of transparency in moderation by platforms, large or small (see e.g., Bouko, Van Ostaeyen, and Voué 2021a). As much research has shown, terrorist actors use multiple strategies to circumvent content moderation (see e.g., Bouko, Van Ostaeyen, and Voué 2021b; Bouko et al. 2021). However, the regulation allows the one-hour deadline to be missed in the event of force majeure, including technical failures.
The second objection concerns the designation of the competent authorities. The regulation requires Member States to designate the bodies empowered to implement it. Although these bodies must be objective, non-discriminatory, and respectful of rights, signatories of an open letter to the European Parliament (EDRI 2021) argue that only courts or independent administrative authorities should have the power to implement the regulation and, in particular, to issue removal orders. The regulation, however, merely requires Member States to determine whether the selected competent authorities are of an administrative, law enforcement, or judicial nature.
Therefore, the signatories emphasize that the possible absence of judicial oversight represents a significant risk to freedom of expression and access to information. In practice, the nature of these authorities is relatively heterogeneous: security services (e.g., Estonia and Latvia), police (e.g., Germany, Cyprus, and Ireland), interior ministries (e.g., Bulgaria, Croatia, and Spain), public prosecutor’s offices (e.g., Belgium), and courts (e.g., Denmark, France, and Greece).
Some countries, such as France, have designated several authorities depending on the missions to be accomplished: a) issuing removal orders, b) carrying out an in-depth examination of removal orders, c) supervising the implementation of specific measures, and d) imposing sanctions (see Article 12). This heterogeneity in the selection of competent authorities raises fears of fragmentation in the application of the regulation within the EU. Furthermore, the police or security services may not have the legal knowledge to determine whether a removal order infringes on fundamental freedoms. For some, these prerogatives should therefore be left to authorities with expertise in legal issues (Gherbaoui and Scheinin 2023).
Thirdly, the signatories of this same open letter were concerned that any competent authority would have the power to demand the immediate removal of any online content hosted on servers located in any country of the European Union. A Member State could thus extend its sphere of intervention beyond its territorial borders without first obtaining judicial authorization and without taking into account the rights of individuals in the jurisdictions concerned. The regulation does, however, include a specific procedure for cross-border removal orders. When issuing such an order, the competent authority must send a copy to the competent authority of the Member State where the HSP has its main establishment, which may scrutinize the removal order within 72 hours.
The HSP (as well as the content provider) may also submit a request to that authority for a review of the removal order. If the order is found to infringe the EU Charter of Fundamental Rights, the authority may adopt a reasoned decision overturning it.
6) EU means to guarantee respect for fundamental rights
The right to an effective remedy is fundamental to the regulation. It provides a right of appeal against removal orders for both HSPs and content providers. It also includes a right of complaint for providers whose content has been removed under “specific measures,” independently of any removal order. Furthermore, an HSP that has taken measures against the dissemination of terrorist content during a calendar year must publish a transparency report on those measures no later than March 1 of the following year.
Note: Tech Against Terrorism Europe (TaTE) is an EU-funded project that aims to inform and support HSPs in complying with EU regulations on terrorist content online. TaTE offers, in particular, a guide and an online inter-university course on the regulation (duration: 3 hours), both available free of charge at tate.techagainstterrorism.org.
References:
Bouko, Catherine, Brigitte Naderer, Diana Rieger, Pieter Van Ostaeyen, and Pierre Voué. 2021. “Discourse Patterns Used by Extremist Salafists on Facebook: Identifying Potential Triggers to Cognitive Biases in Radicalized Content”. Critical Discourse Studies 19 (3): 1–22. https://doi.org/10.1080/17405904.2021.1879185.
Bouko, Catherine, Pieter Van Ostaeyen, and Pierre Voué. 2021a. “Facebook’s Policies against Extremism: Ten Years of Struggle for More Transparency”. First Monday 26 (9). https://doi.org/10.5210/fm.v26i9.11705.
Bouko, Catherine, Pieter Van Ostaeyen, and Pierre Voué. 2021b. “How Jihadi Salafists Sometimes Breach, But Mostly Circumvent, Facebook’s Community Standards in Crisis, Identity and Solution Frames”. Studies in Conflict & Terrorism 47 (4): 336–91. https://doi.org/10.1080/1057610X.2021.1963092.
EDRI. 2021. Open Letter to the European Parliament. https://edri.org/wp-content/uploads/2021/03/MEP_TERREG_Letter_EN.pdf
EUROPOL. 2023. European Union Terrorism Situation and Trend Report 2023 (TE-SAT). https://www.europol.europa.eu/cms/sites/default/files/documents/TESAT%202023%20-%20Synthe%CC%80se%20.pdf
Gherbaoui, Tarik, and Martin Scheinin. 2023. “A Dual Challenge to Human Rights Law: Online Terrorist Content and Governmental Orders to Remove It”. Journal européen des droits de l’homme – European Journal of Human Rights 1: 3–29. https://ssrn.com/abstract=4247120
IOCTA. 2017. Internet Organised Crime Threat Assessment. https://www.europol.europa.eu/sites/default/files/documents/iocta2017.pdf
Vergada, Ingrid. 2018. “Modération des contenus terroristes : l’Europe va mettre les plateformes à l’amende” [Moderation of terrorist content: Europe will fine the platforms]. Le Figaro. https://www.lefigaro.fr/secteur/high-tech/2018/09/12/32001-20180912ARTFIG00160-moderation-des-contenus-terroristes-l-europe-va-mettre-les-plateformes-a-l-amende.php
