Top-Down Versus Bottom-Up Misinformation
The chart shows that high-level politicians, celebrities, and other prominent public figures produced or spread only 20% of the misinformation in the Reuters Institute's sample, yet that misinformation attracted a large majority of all social media engagements in the sample. The first bar shows the share of content in the whole sample (N=225) that was produced or shared by prominent figures. The second bar shows the share of total social media engagements attracted by content from prominent figures, within the sub-sample of social media posts with available engagement data (N=145).
Trademark Content Take-Down Notices Received by TikTok
The chart presents the number of trademark take-down requests received by TikTok and the number of successful take-downs in 2021. The TikTok Community Guidelines and Terms of Service prohibit content that infringes third-party intellectual property. When TikTok receives valid take-down requests from rights holders based on violations of copyright or trademark law, it removes the allegedly infringing content in a timely manner. There is a break in the data series during the first half of 2021, so the two periods are not directly comparable: from January to April 2021, the data includes all trademark take-down notices received by the platform, while from May 2021 onwards it includes only valid trademark take-down notices. Valid trademark take-down notices are those that provide sufficient information to assess whether trademark infringement has occurred under applicable law.
Videos Removed by TikTok for Policy Violations, by Type of Detection
This chart shows the volume of videos removed by TikTok for policy violations from July 2020 to December 2021. The data shows that the volume of videos removed by TikTok in the last quarter of 2021 increased by 86% compared to the same period of the previous year. However, the total number of videos removed by TikTok represents only about 1% of all videos uploaded to the platform.
Videos Removed by TikTok for Policy Violations, by Type of Policy
This chart shows the volume of videos removed by TikTok for policy violations from July 2020 to December 2021. A video may violate multiple policies, and each violation is reflected separately. In certain rare circumstances, such as emergency situations or hardware outages, a removed video's violation category may not be captured; these videos are not represented in the chart. The data shows that minors' safety is the main reason for video removal by TikTok (45% in the last quarter of 2021), followed by illegal activities and regulated goods (19.5%) and adult nudity and sexual activities (11%).
Videos Removed by YouTube, by Removal Reason
This chart shows the distribution of videos removed by YouTube, by removal reason, over the period September 2018 to March 2022. These removal reasons correspond to YouTube's Community Guidelines. Reviewers evaluate flagged videos against all of YouTube's Community Guidelines and policies, regardless of the reason the video was originally flagged. As the chart shows, the most frequent reasons for removal are child safety, violent or graphic content, and nudity or sexual content. In the first quarter of 2022, removals of child safety content declined by 53.5% compared to the same period of 2021, while removals of harmful or dangerous content increased by 463% and of harassment and cyberbullying by 579%.
Videos Removed by YouTube, by Source of First Detection
The chart shows the percentage of videos removed by YouTube over the period October 2017 to March 2022, by first source of detection (automated flagging or human detection). Flags from human detection can come from a user or a member of YouTube's Trusted Flagger program. Trusted Flagger program members include individuals, NGOs, and government agencies that are particularly effective at notifying YouTube of content that violates its Community Guidelines. The chart shows that automated flagging is by far the most common first source of detection, well ahead of human detection.
Videos Removed by YouTube, by Source of First Detection (Human)
The chart shows the number of videos removed by YouTube over the period October 2017 to March 2022, by first source of human detection. Flags from human detection can come from a user or a member of YouTube's Trusted Flagger program. Trusted Flagger program members include individuals, NGOs, and government agencies that are particularly effective at notifying YouTube of content that violates its Community Guidelines. The chart shows that the largest number of removed videos were first flagged by users (12,468,976 videos), followed by individual trusted flaggers (4,614,456 videos), NGOs (181,430 videos), and government agencies (755 videos).
What Happened to Reported Content (2018)
The chart shows that 45% of respondents who took action after encountering illegal content online reported that the content was taken down, while 20% reported that it was kept online without changes. Respondents answered the question "What happened to reported content?", for which multiple answers were possible. European Union refers to the EU28; the United Kingdom left the European Union on 31 January 2020.
What Happened to Reported Content Across European Union Member States (2018)
The chart shows that, among respondents who took action after encountering illegal content online, respondents from Hungary were the most likely to report that the content was taken down, while respondents from Estonia were the least likely to do so. Respondents answered the question "What happened to reported content?", for which multiple answers were possible. European Union refers to the EU28; the United Kingdom left the European Union on 31 January 2020.