Illegal Content

The law is clear: what is illegal offline is illegal online. And a bevy of European laws – such as the directive on combating terrorism (2017) and the directive on combating the sexual abuse and sexual exploitation of children and child pornography (2011) – have been promulgated over the years, requiring platforms to remove illegal content "expeditiously," in the words of the electronic commerce directive (2000), once they are notified of it or otherwise become aware of its existence.

A short list of offences covered by these rules includes incitement to terrorism, xenophobic hate speech with intent to incite, copyright infringement and child abuse. Few dispute the underlying principles at stake, but the debate grows heated over how – and how quickly – platforms should be legally bound to remove illegal material when it appears. How much redress do platform users have if they believe their content was removed unjustifiably? And how much duty do the platforms have to cooperate proactively with security and law enforcement services?

The European Commission has provided a handy framework: a communication on tackling illegal content online (2017) and a recommendation on measures to effectively tackle illegal content online (2018). These guidelines are non-binding, but – faced with a rise in phenomena like terrorism – some governments have stepped in with steeper, more binding rules. Germany is a case in point. Its Netzwerkdurchsetzungsgesetz (NetzDG) threatens large fines if "manifestly unlawful" material does not disappear from platforms within 24 hours of notification. France’s proposed Loi contre les contenus haineux sur Internet (the "Avia law," named for sponsor Laetitia Avia) would do the same.

Put simply, the discussion boils down to a few key questions: is illegal content coming down quickly enough? Are the rules and codes of conduct strong enough to prevent damage, given the speed with which harm can occur? Is the volume of illegal content decreasing under the measures already in place, and is it decreasing quickly enough? And if stronger measures are needed, how can they be constructed to obtain better results without violating important rights such as privacy, redress and free speech?

The evidence presented in this section covers illegal content broadly. Separate sections will deal with more concrete aspects, such as incitement to terrorism, hate speech and copyright infringement. Additional information on illegal content online can be found on the World Intermediary Liability Map (WILMap), led by the Center for Internet and Society at Stanford Law School.


Videos Removed by YouTube, by Source of First Detection

The chart shows the percentage of videos removed by YouTube for the period October 2017-March 2022, by first source of detection (automated flagging or human detection). Flags from human detection can come from a user or a member of YouTube’s Trusted Flagger program. Trusted Flagger program members include individuals, NGOs, and government agencies that are particularly effective at notifying YouTube of content that violates its Community Guidelines. The chart shows that automated flagging is by far the most common first source of detection, well ahead of human detection.

Videos Removed by YouTube, by Source of First Detection (Human)

The chart shows the number of videos removed by YouTube for the period October 2017-March 2022, by first source of detection (human detection). Flags from human detection can come from a user or a member of YouTube’s Trusted Flagger program. Trusted Flagger program members include individuals, NGOs, and government agencies that are particularly effective at notifying YouTube of content that violates its Community Guidelines. The chart shows that the largest number of removed videos were first flagged by users (12,468,976 videos), followed by individual trusted flaggers (4,614,456 videos), NGOs (181,430 videos) and government agencies (755 videos).

What Happened to Reported Content (2018)

The chart shows that 45% of respondents who took action after encountering illegal content online reported that the content was taken down, but 20% reported that it was kept online without changes. Participants answered the question "What happened to reported content?", for which multiple answers were possible. European Union refers to the EU28. The United Kingdom left the European Union on 31 January 2020.

What Happened to Reported Content Across European Union Member States (2018)

The chart shows that, among respondents who took action after encountering illegal content online, respondents from Hungary were the most likely to report that the content was taken down, while respondents from Estonia were the least likely to do so. Participants answered the question "What happened to reported content?", for which multiple answers were possible. European Union refers to the EU28. The United Kingdom left the European Union on 31 January 2020.