Illegal Content

The law is clear: what is illegal offline is illegal online. A bevy of European laws – such as the directive on combating terrorism (2017) and the directive on combating the sexual abuse and sexual exploitation of children and child pornography (2011) – have been promulgated over the years, requiring platforms to remove illegal content "expeditiously," in the words of the electronic commerce directive (2000), once they are notified of it or otherwise become aware of its existence.

A short list of offences covered by these rules includes incitement to terrorism, xenophobic hate speech with intent to incite, copyright infringement and child abuse. Few dispute the underlying principles at stake, but the debate grows heated over how – and how quickly – platforms should be legally bound to remove illegal material when it appears. How much redress do platform users have if they believe their content was removed unjustifiably? And how much duty do the platforms have to cooperate pro-actively with security and law enforcement services?

The European Commission has provided a handy framework: a communication on tackling illegal content online (2017) and a recommendation on measures to effectively tackle illegal content online (2018). These guidelines are non-binding, but – faced with a rise in phenomena like terrorism – some governments have stepped in with stricter, binding rules. Germany is a case in point: its Netzwerkdurchsetzungsgesetz (NetzDG) threatens large fines if "manifestly unlawful" material does not disappear from platforms within 24 hours of notification. France’s proposed Loi contre les contenus haineux sur Internet (the "Avia law," named for its sponsor Laetitia Avia) would do the same.

Put simply, the discussion boils down to a handful of determinations: is illegal content coming down quickly enough? Are the rules and codes of conduct strong enough to prevent damage, given the speed with which harm can occur? Is the volume of illegal content decreasing under the measures already in place, and is it decreasing quickly enough? And if stronger measures are needed, how can they be constructed to obtain better results without violating important rights such as privacy, redress and free speech?

The evidence presented in this section covers illegal content broadly. Separate sections will deal with more concrete aspects, such as incitement to terrorism, hate speech and copyright infringement. Additional information on illegal content online can be found on the World Intermediary Liability Map (WILMap), led by the Center for Internet and Society at Stanford Law School.

Government Requests Addressed to Google to Remove Content by Type of Reason

Governments contact Google with content removal requests for a number of reasons. Government bodies may claim that content violates a local law, or they may include court orders – often not directed at Google – with their requests; both types of requests are counted in this report. Google also includes government requests to review content to determine whether it violates Google's product community guidelines and content policies. The data cover the period January 2010 - December 2021. Overall, government requests to remove content increased by 11% in the second half of 2021 compared with the same period in 2020. Among the reasons for removal, requests related to privacy and security increased by more than 250% and all other reasons by almost 30%, while requests related to regulated goods fell by 20% and those related to copyright declined by 32%.
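The period-over-period figures above are simple percentage changes between matching half-year windows. As a minimal sketch of that arithmetic – using made-up request counts, not figures from Google's transparency report – it looks like this in Python:

    # Percentage change between two reporting periods.
    # The counts below are hypothetical placeholders, not figures
    # from Google's transparency report.

    def pct_change(current: float, previous: float) -> float:
        """Percentage change from `previous` to `current`."""
        return (current - previous) / previous * 100

    requests_h2_2020 = 40_000  # hypothetical count, second half of 2020
    requests_h2_2021 = 44_400  # hypothetical count, second half of 2021

    print(f"Overall change: {pct_change(requests_h2_2021, requests_h2_2020):+.1f}%")
    # Overall change: +11.0%

The same computation applies to each removal reason (privacy and security, regulated goods, copyright, and so on), with each category compared against its own count in the earlier period.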

Government Requests to TikTok to Remove or Restrict Content or Accounts

The chart presents the volume of government removal or restriction requests received by TikTok and the platform's response to those requests. All requests received from governments are reviewed against both TikTok's Community Guidelines and Terms of Service and the applicable law. Content that is illegal in a given country but does not breach the Community Guidelines is restricted in that country; requests concerning content that is neither illegal nor in violation of the Community Guidelines are rejected. The data show that in the second half of 2021 the volume of government requests declined by 29% compared with the previous period, but it remained roughly four times higher than in the same period of 2020.
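The review flow described above amounts to a two-step check: first against TikTok's own rules, then against local law. The sketch below is our own illustration of that logic; the function and names are hypothetical assumptions, not TikTok's actual code or API.

    # Illustrative triage of a government request, as described above.
    # Names and structure are our own assumptions.

    from enum import Enum, auto

    class Outcome(Enum):
        REMOVE = auto()            # violates Community Guidelines / Terms of Service
        RESTRICT_LOCALLY = auto()  # within the guidelines, but illegal in the requesting country
        REJECT = auto()            # neither illegal nor in breach of the guidelines

    def triage(violates_guidelines: bool, illegal_in_country: bool) -> Outcome:
        if violates_guidelines:
            return Outcome.REMOVE
        if illegal_in_country:
            return Outcome.RESTRICT_LOCALLY
        return Outcome.REJECT

    print(triage(violates_guidelines=False, illegal_in_country=True))
    # Outcome.RESTRICT_LOCALLY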

Human Detection of Illegal Content Online, by Flagging Reason

The chart shows the distribution of videos removed by YouTube based on human detection, by flagging reason. The data represent average shares of videos removed over the period October 2017 - March 2022, calculated from the quarterly values in the transparency report. The results show that the most common reason users flag videos is spam, misleading content and scams, followed by sexual content and hateful or abusive content. When flagging a video, human flaggers can select a reason for reporting it and leave comments or video timestamps for YouTube's reviewers; a single video may be flagged multiple times and for different reasons. Reviewers evaluate flagged videos against all of the Community Guidelines and policies, regardless of why they were originally flagged, so flagging a video does not necessarily result in its removal: human-flagged videos are removed only once a trained reviewer confirms a policy violation.
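The chart's values are means of the quarterly shares published in the transparency report. A minimal sketch of that averaging, using placeholder numbers rather than YouTube's actual figures:

    # Average share of removals for one flagging reason across quarters.
    # The quarterly shares are hypothetical placeholders.

    quarterly_shares = [0.44, 0.48, 0.50, 0.46]  # e.g. spam/misleading/scams

    average_share = sum(quarterly_shares) / len(quarterly_shares)
    print(f"Average share over the period: {average_share:.0%}")  # 47%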

Human Flags of YouTube Videos by Type of Flagger

The chart shows the distribution of human flags on YouTube over the period October 2017 - March 2022, by type of flagger. Human flags can come from ordinary users or from members of YouTube’s Trusted Flagger program, which includes individuals, NGOs and government agencies. The chart shows that the majority of human flags come from users, followed by individual trusted flaggers; the share of flags from NGOs is negligible compared with the other two types of flagger.

Internet Economy as Percentage of Gross Domestic Product (2016)

This chart shows the share of the internet economy in gross domestic product for a selection of countries. The data show that the internet has created tremendous value for the global economy, with a substantial impact on GDP in each of the selected countries.

Internet Platform Funding Comparison: European Union and United States

This chart shows a funding gap between the European Union and the United States. The data cover a 15-year horizon, considering companies formed between 1 January 2000 and the end of 2014. The results suggest that a US-based company, operating under the framework set forth by Section 230 of the Communications Decency Act, is five times more likely to secure investment over $10 million, and nearly ten times more likely to receive investment over $100 million, than an internet company in the European Union operating under the more limited E-Commerce Directive. Internet platform companies built under the Section 230 regime are therefore much more likely to receive the significant investment necessary to grow and succeed.
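The "X times more likely" comparisons above are ratios of the two populations' success rates at each funding threshold. A hedged sketch with invented shares – the study's underlying counts are not reproduced here:

    # Likelihood ratio at a funding threshold, with hypothetical shares.
    us_share_over_10m = 0.10  # assumed share of US companies raising > $10M
    eu_share_over_10m = 0.02  # assumed share of EU companies raising > $10M

    print(f"US/EU likelihood ratio: {us_share_over_10m / eu_share_over_10m:.0f}x")
    # US/EU likelihood ratio: 5x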

Investor Concern Regarding Potential New Regulation in France (2014)

According to the chart, 90% of French investors believe the legal environment has the most negative impact on their investing activities, and a significant majority (87%) are concerned about investing in digital content intermediaries, which today face ambiguity and uncertain outcomes, potentially large damages, and the risk of secondary liability if new anti-piracy regulations are introduced.

Investor Concern Regarding Potential New Regulation in Germany (2014)

According to the chart, 80% of German investors believe the legal environment has the most negative impact on their investing activities, and a significant majority (90%) are concerned about investing in digital content intermediaries, which today face ambiguity and uncertain outcomes, potentially large damages, and the risk of secondary liability if new anti-piracy regulations are introduced.

Investor Concern Regarding Potential New Regulation in Italy (2014)

According to the chart, 83% of Italian investors believe the legal environment has the most negative impact on their investing activities, and an equal share (83%) are concerned about investing in digital content intermediaries, which today face ambiguity and uncertain outcomes, potentially large damages, and the risk of secondary liability if new anti-piracy regulations are introduced.