Illegal Content

The law is clear: what is illegal offline is illegal online. And a bevy of European laws – such as the directive on combating terrorism (2017) and the directive on combating the sexual abuse and sexual exploitation of children and child pornography (2011) – have been promulgated over the years, requiring platforms to remove illegal content "expeditiously," in the words of the electronic commerce directive (2000), once they are notified of it or otherwise become aware of its existence.

A short list of offences covered by these rules includes incitement to terrorism, xenophobic hate speech with intent to incite, copyright infringement and child abuse. Few dispute the underlying principles at stake, but the debate grows heated over how – and how quickly – platforms should be legally bound to remove illegal material when it appears. How much redress do platform users have if they believe their content was removed unjustifiably? And how much duty do the platforms have to cooperate pro-actively with security and law enforcement services?

The European Commission has provided a handy framework: a communication on tackling illegal content online (2017) and a recommendation on measures to effectively tackle illegal content online (2018). These guidelines are non-binding, but – faced with a rise in phenomena like terrorism – some governments have stepped in with stricter, binding rules. Germany is a case in point. Its Netzwerkdurchsetzungsgesetz (NetzDG) threatens large fines if "manifestly unlawful" material does not disappear from platforms within 24 hours of notification. France’s proposed Loi contre les contenus haineux sur Internet (the "Avia law," named for its sponsor Laetitia Avia) would do the same.

Put simply, the discussion boils down to several simple determinations: is illegal content coming down quickly enough? Are the rules and codes of conduct strong enough to prevent damage from occurring given the speed with which harm can take place? Is the volume of illegal content decreasing given the measures already in place, and is it decreasing quickly enough? And if stronger measures are needed, how can they be constructed to obtain better results without violating important rights such as privacy, redress and free speech?

The evidence presented in this section covers illegal content broadly. Separate sections deal with more concrete aspects, such as incitement to terrorism, hate speech and copyright infringement. Additional information on illegal content online can be found on the World Intermediary Liability Map (WILMap), led by the Center for Internet and Society at Stanford Law School.


Share of Web Addresses Requested to Be Delisted

The chart shows the percentage of web addresses that have been delisted after review, out of the total requests received. The data cover the period 28 May 2014 to 13 June 2022. Delisting requests that are still pending review, or that require additional information to be processed, are not included in the chart. The live chart was last accessed on 13 June 2022.

Sources and Destinations of Imports and Exports for European Small and Medium-sized Enterprises

This chart highlights that the overwhelming majority of European SMEs rely heavily on the European Single Market, with SMEs receiving 81% of their imports from and sending 81% of their exports to other EU Member States.

The Legal Environment's Negative Impact on Investing (2014)

In the survey, investors were asked which of four factors had the most negative impact on their investing behavior: the legal environment, the economy, the competitive environment, or the expected return on their investment. The results show that in all eight countries, investors view the legal environment as having the most negative impact: on average, 89% of investors surveyed said it had a modest or strong negative impact, including 93% of United States investors and an average of 89% of European Union investors. European Union refers to the EU28. The United Kingdom left the European Union on 31 January 2020.

The Negative Impact on Investors of Regulatory Ambiguity (2014)

The chart shows that a large majority of respondents in every surveyed country say that an ambiguous regulatory framework makes them uncomfortable investing in digital content intermediaries that offer user-uploaded music or video.

The Number of Videos Removed by YouTube, by Source of First Detection

The chart shows the number of videos removed by YouTube over the period October 2017-March 2022, by first source of detection (automated flagging or human detection). Flags from human detection can come from a user or from a member of YouTube’s Trusted Flagger program, which includes individuals, NGOs and government agencies. The chart shows that the number of videos flagged automatically is significantly higher than the number flagged by humans. Among videos first detected by humans, most were flagged by users, followed by individual trusted flaggers, NGOs and government agencies.

The Top Ten Countries for Hosting Child Sexual Abuse Content

The chart shows the top ten countries hosting web pages with child sexual abuse material, based on the assessment of the Internet Watch Foundation. Notably, seven of the ten countries are in Europe and six are in the European Union.

Trademark Content Take-Down Notices Received by TikTok

The chart presents the number of trademark take-down requests received by TikTok and the number of successful take-downs in 2021. The TikTok Community Guidelines and Terms of Service prohibit content that infringes third-party intellectual property. When TikTok receives valid take-down requests from rights holders based on violations of copyright or trademark law, it removes the allegedly infringing content in a timely manner. There is a break in the data series in 2021, and the data for the two periods are not directly comparable: for January-April 2021, the data include all trademark take-down notices received by the platform, while from May 2021 onwards they include only valid trademark take-down notices, i.e. notices that provide sufficient information to assess whether there has been trademark infringement under applicable law.

Videos Removed by TikTok for Policy Violations, by Type of Detection

This chart shows the volume of videos removed by TikTok for policy violations from July 2020 until December 2021. The data show that the volume of videos removed by TikTok in the last quarter of 2021 increased by 86% compared with the same period of the previous year. However, the total number of videos removed represents only about 1% of all videos uploaded to the platform.

Videos Removed by TikTok for Policy Violations, by Type of Policy

This chart shows the volume of videos removed by TikTok for policy violations from July 2020 until December 2021. A video may violate multiple policies, and each violation is reflected. In certain rare circumstances, such as emergencies or hardware outages, a removed video’s violation category may not be captured; these videos are not represented in the chart. The data show that minors' safety is the main reason for video removal by TikTok (45% in the last quarter of 2021), followed by illegal activities and regulated goods (19.5%) and adult nudity and sexual activities (11%).

Videos Removed by YouTube, by Removal Reason

This chart shows the distribution of videos removed by YouTube, by removal reason, over the period September 2018-March 2022. The removal reasons correspond to YouTube’s Community Guidelines; reviewers evaluate flagged videos against all of YouTube's Community Guidelines and policies, regardless of the reason the video was originally flagged. As the chart shows, the most frequent reasons for removal are child-abusive content, violent or graphic content, and nudity or sexual content. In the first quarter of 2022, removals for child safety declined by 53.5% compared with the same period of 2021, while removals for harmful or dangerous content increased by 463% and removals for harassment and cyberbullying by 579%.