Illegal Content

The law is clear: what is illegal offline is illegal online. And a bevy of European laws – such as the directive on combating terrorism (2017) and the directive on combating the sexual abuse and sexual exploitation of children and child pornography (2011) – have been promulgated over the years, requiring platforms to remove illegal content "expeditiously," in the words of the electronic commerce directive (2000), once they are notified or in some other way become aware of its existence.

A short list of offences covered by these rules includes incitement to terrorism, xenophobic hate speech with intent to incite, copyright infringement and child abuse. Few dispute the underlying principles at stake, but the debate grows heated over how – and how quickly – platforms should be legally bound to remove illegal material when it appears. How much redress do platform users have if they believe their content was removed unjustifiably? And how much duty do the platforms have to cooperate pro-actively with security and law enforcement services?

The European Commission has provided a handy framework: a communication on tackling illegal content online (2017) and a recommendation on measures to effectively tackle illegal content online (2018). These guidelines are non-binding, but – faced with a rise in phenomena such as online terrorist propaganda – some governments have stepped in with steeper, binding rules. Germany is a case in point. Its Netzwerkdurchsetzungsgesetz (NetzDG) threatens large fines if "manifestly unlawful" material does not disappear from platforms within 24 hours of notification. France’s proposed Loi contre les contenus haineux sur Internet (the "Avia law," named for sponsor Laetitia Avia) would do the same.

Put simply, the discussion boils down to several simple determinations: is illegal content coming down quickly enough? Are the rules and codes of conduct strong enough to prevent damage from occurring given the speed with which harm can take place? Is the volume of illegal content decreasing given the measures already in place, and is it decreasing quickly enough? And if stronger measures are needed, how can they be constructed to obtain better results without violating important rights such as privacy, redress and free speech?

The evidence presented in this section covers illegal content broadly. Separate sections will deal with more concrete aspects, such as incitement to terrorism, hate speech and copyright infringement. Additional information on illegal content online can be found on the World Intermediary Liability Map (WILMap), led by the Center for Internet and Society at Stanford Law School.




Accounts Removed by TikTok for Policy Violations, by Type of Reason

This chart shows the total number of accounts removed by TikTok for violations of its policies, from July 2020 until December 2021. The data shows that the number of accounts removed in the last quarter of 2021 was more than five times higher than in the same period of the previous year. Overall, the main reason for removal is the account belonging to a suspected underage user, which accounted for 63.8% of accounts removed in the last quarter of 2021.

Acquired or Accessed Any Content Type Illegally (2017)

The chart shows the percentage of internet-using respondents who acquired or accessed any type of content illegally over the past year. Respondents from Poland and Spain were the most likely among European Union countries to report having done so. European Union refers to the EU28; the United Kingdom left the European Union on 31 January 2020. As a note, the data in the chart covers exclusively stream-ripping and pirated copies on physical carriers.

Action Taken After Encountering Illegal Content (2018)

The chart shows that the majority of users took no action after encountering illegal content online. The results are based on answers to the question “Q4. What action did you take after encountering this content?”, for which respondents could select more than one answer. European Union refers to the EU28; the United Kingdom left the European Union on 31 January 2020.

Action Taken After Encountering Illegal Content (By Country)

The chart shows that most users took no action after encountering illegal content online, although respondents from Germany were the least likely to report having taken no action. The results are based on answers to the question “What action did you take after encountering this content?”, for which respondents could select more than one answer. European Union refers to the EU28; the United Kingdom left the European Union on 31 January 2020.

Content Actioned Under Child Nudity and Sexual Exploitation Violations on Facebook

The chart shows the volume of content actioned under child nudity and sexual exploitation violations on Facebook, from the third quarter of 2018 until the first quarter of 2022. Since April 2021, the Child Nudity and Sexual Exploitation category has been renamed Child Endangerment and includes two distinct topics (Nudity and Physical Abuse, and Sexual Exploitation), which are monitored separately. The data from the second and third quarters of 2021 shows that the volume of content actioned for sexual exploitation violations is significantly (roughly ten times) higher than the content actioned for nudity and physical abuse. In the first quarter of 2022, the volume of content actioned for sexual exploitation violations decreased by 35.5% compared to the second quarter of 2021, but it remains significantly higher than the content actioned for nudity and physical abuse.

Content Actioned Under Dangerous Organisations Violations on Facebook

This chart shows the content actioned under terrorism and organised hate violations on Facebook, from October 2017 until March 2022. The data shows that the volume of content actioned under terrorism violations in the first quarter of 2022 almost doubled compared to the same period in the previous year. Data on content actioned under organised hate violations is more recent (from October 2019 onwards); it shows a slight increase in the first quarter of 2022 compared to the fourth quarter of 2021, but the volume remains significantly lower than in the first quarter of 2021.

Content Actioned Under Hate Speech Violations on Instagram

The chart shows the number of pieces of content actioned under hate speech violations on Instagram over the period October 2019 - March 2022. The data shows a significant increase in hate speech violations found and actioned by Instagram during the monitoring period, with a peak in the second quarter of 2021, when 9.8 million pieces of content were actioned. In the first quarter of 2022, the volume of content actioned was significantly lower (3.4 million pieces of content).

Content Removal Comparison: Google Community Guidelines vs. Germany’s Network Enforcement Act (2018)

The chart presents the distribution of posts removed by Google due to violations of Google's community guidelines and of Germany’s Network Enforcement Act, broken down by grounds for removal, for the period July - December 2018. The data shows that the majority of removal decisions are based on the platform’s private standards, as platforms often prioritise compliance with their own community guidelines over German speech laws.

Content Removal Comparison: Google Community Guidelines vs. Germany’s Network Enforcement Act (2019-2021)

The chart presents the distribution of items removed by Google since 2019 due to violations of Google's community guidelines and of Germany’s Network Enforcement Act, broken down by grounds for removal. The data shows that the majority of removal decisions are based on the platform’s private standards, as platforms often prioritise compliance with their own community guidelines over German speech laws.

Content Removal Under Germany’s Network Enforcement Act (2018-2021)

The chart shows the total number of items removed or blocked by Google due to violations of Germany’s Network Enforcement Act, by type of submitter (users and reporting agencies). The results are based on data from the Google Transparency Report, last accessed on 14 February 2022.