Illegal Content

The law is clear: what is illegal offline is illegal online. And a bevy of European laws – such as the directive on combating terrorism (2017) and the directive on combating the sexual abuse and sexual exploitation of children and child pornography (2011) – have been promulgated over the years, requiring platforms to remove illegal content "expeditiously," in the words of the electronic commerce directive (2000), once they are notified or otherwise become aware of its existence.

A short list of offences covered by these rules includes incitement to terrorism, xenophobic hate speech with intent to incite, copyright infringement and child abuse. Few dispute the underlying principles at stake, but the debate grows heated over how – and how quickly – platforms should be legally bound to remove illegal material when it appears. How much redress do platform users have if they believe their content was removed unjustifiably? And how much duty do the platforms have to cooperate pro-actively with security and law enforcement services?

The European Commission has provided a handy framework: a communication on tackling illegal content online (2017) and a recommendation on measures to effectively tackle illegal content online (2018). These guidelines are non-binding, but – faced with a rise in phenomena like terrorism – some governments have stepped in with steeper, more binding rules. Germany is a case in point. Its Netzwerkdurchsetzungsgesetz (NetzDG) threatens large fines if "manifestly unlawful" material does not disappear from platforms within 24 hours of notification. France’s proposed Loi contre les contenus haineux sur Internet (the "Avia law," named for sponsor Laetitia Avia) would do the same.

Put simply, the discussion boils down to several simple determinations: is illegal content coming down quickly enough? Are the rules and codes of conduct strong enough to prevent damage from occurring given the speed with which harm can take place? Is the volume of illegal content decreasing given the measures already in place, and is it decreasing quickly enough? And if stronger measures are needed, how can they be constructed to obtain better results without violating important rights such as privacy, redress and free speech?

The evidence presented in this section covers illegal content broadly. Separate sections will deal with more concrete aspects, such as incitement to terrorism, hate speech and copyright infringement. Additional information on illegal content online can be found on the World Intermediary Liability Map (WILMap), led by the Center for Internet and Society at Stanford Law School.




Acquired or Accessed Any Content Type Illegally (2017)

The chart shows the percentage of internet-using respondents who acquired or accessed any type of content illegally over the past year. Respondents from Poland and Spain were the most likely to report having done so among European Union countries. European Union refers to EU28. The United Kingdom left the European Union on 31 January 2020. As a note, the data in the chart covers only stream-ripping and pirated copies on physical carriers.

Action Taken After Encountering Illegal Content (2018)

The chart shows that the majority of users took no action after encountering illegal content online. The chart results are based on the answers to the question "Q4. What action did you take after encountering this content?", for which the selection of more than one answer is possible. European Union refers to EU28. The United Kingdom left the European Union on 31 January 2020.

Action Taken After Encountering Illegal Content (By Country)

The chart shows that most users took no action after encountering illegal content online, although respondents from Germany were the least likely to report having taken no action. The chart results are based on the answers to the question "What action did you take after encountering this content?", for which the selection of more than one answer is possible. European Union refers to EU28. The United Kingdom left the European Union on 31 January 2020.

Content Actioned Under Child Nudity and Sexual Exploitation Violations on Facebook

The chart shows the number of pieces of content actioned under child nudity and sexual exploitation violations on Facebook. The data shows that, after the peak registered in the third quarter, the volume of content actioned on decreased by 56% in the last quarter of 2020.

Content Actioned Under Dangerous Organisations Violations on Facebook

This chart shows the content actioned under terrorism and organised hate violations on Facebook. The data shows a slightly higher volume of content actioned on under terrorism violations in 2020 compared to previous years. Data on content actioned under organised hate violations is more recent, but it shows a significant increase during 2020.

Content Actioned Under Hate Speech Violations on Instagram

The chart shows the number of pieces of content actioned under hate speech violations on Instagram. The data shows a significant increase in hate speech violations found and actioned on by Instagram in the last two quarters of 2020.

Content Removal Comparison: Google Community Guidelines vs. Germany’s Network Enforcement Act (2018)

The chart presents the distribution of posts removed by Google for violations of Google's community guidelines and of Germany’s Network Enforcement Act, broken down by the grounds for removal, for the period July - December 2018. The data shows that the majority of removal decisions are based on the platform’s private standards, as platforms often prioritise compliance with their community guidelines rather than with German speech laws.

Content Removed by Twitter for Policy Violations (2019-2020)

The chart shows the amount of content removed by Twitter due to Twitter policy violations. The data shows a 33% decline in the amount of content removed in the period January-June 2020 compared to the previous period.

Distribution of Content Actioned Under Other Types of Violation on Facebook (2017-2020)

The chart shows the distribution of content actioned under other types of violations on Facebook, from the fourth quarter of 2017 until the end of 2020. The figure excludes content removed under fake accounts and spam violations. The data shows that adult nudity and sexual activity remains the main reason for removal on Facebook, followed closely by hate speech and violent and graphic content.

Distribution of Content Creators on Social Media Platforms

A recent report on the “creator economy” by Yuanling Yuan, senior associate at SignalFire, shows that there are over 50 million creators on YouTube, Instagram, Twitch, TikTok, and other social media platforms. The chart presents the distribution of these content creators by social media platform and professional status, with approximately two million full-time creators earning six-figure salaries by creating content daily or weekly. And that massive distributed content creation engine means that about 90% of the video, audio, photo, and text-based content consumed today by Gen Z is created by individuals, not corporations.