Explore the Charts

You can filter the charts by clicking the relevant button on the left side. You can access each chart and download the underlying data.

Share of Fake Accounts and Spam Content Actioned on Facebook (2017-2020)

The chart shows the share of fake accounts and spam content actioned on Facebook from the fourth quarter of 2017 until the end of 2020. While these two violation types remain the main reasons for content removal on Facebook, the data show that other types of violations (such as adult nudity and sexual activity, child nudity and sexual exploitation, bullying and harassment, dangerous organisations, hate speech, and violent and graphic content) also increased during this period.

The Top Ten Countries for Hosting Child Sexual Abuse Content

The chart shows the top ten countries that host web pages with child sexual abuse material, based on the assessment of the Internet Watch Foundation. Interestingly, seven of the ten countries are in Europe and six are in the European Union.

Top-Down Versus Bottom-Up Misinformation

The chart shows that high-level politicians, celebrities, and other prominent public figures produced or spread only 20% of the misinformation in the Reuters Institute's sample, yet that misinformation attracted a large majority of all social media engagements in the sample. The first bar shows the share of content produced or shared by prominent persons in the whole sample (N=225). The second bar shows the percentage of total social media engagements attracted by content from prominent persons, within the sub-sample of social media posts with available engagement data (N=145).

Videos Removed by YouTube, by Removal Reason (2018-2021)

This chart shows the distribution of videos removed by YouTube, by removal reason, over the period September 2018-March 2021. The removal reasons correspond to YouTube’s Community Guidelines. Reviewers evaluate flagged videos against all of YouTube’s Community Guidelines and policies, regardless of the reason the video was originally flagged. As the chart shows, the most frequent reasons for removal are child abusive content; nudity or sexual content; violent or graphic content; and spam, misleading and scam content. Removals for child abusive content increased by 161% compared with the same period of 2019, while removals for spam, misleading and scam content declined by 87% over the same period.

Videos Removed by YouTube, by Source of First Detection (2017-2021)

The chart shows the percentage of videos removed by YouTube, by first source of detection (automated flagging or human detection). Flags from human detection can come from a user or a member of YouTube’s Trusted Flagger program. Trusted Flagger program members include individuals, NGOs, and government agencies that are particularly effective at notifying YouTube of content that violates its Community Guidelines. The chart shows that automated flagging is by far the leading first source of detection. Among human detections, the largest number of removed videos were first flagged by users, followed by individual trusted flaggers, NGOs, and government agencies.

Videos Removed by YouTube, by Source of First Detection (Human)

The chart shows the number of videos removed by YouTube, by first source of human detection, for the period October 2017-March 2021. Flags from human detection can come from a user or a member of YouTube’s Trusted Flagger program. Trusted Flagger program members include individuals, NGOs, and government agencies that are particularly effective at notifying YouTube of content that violates its Community Guidelines. The chart shows that the highest number of removed videos were first flagged by users (11,413,721 videos), followed by individual trusted flaggers (4,373,064 videos), NGOs (157,051 videos) and government agencies (557 videos).

What Happened to Reported Content (2018)

The chart shows that 45% of respondents who took action after encountering illegal content online reported that the content was taken down, while 20% reported that it was kept online without changes. Participants answered the question "What happened to reported content?", for which multiple answers were possible. European Union refers to the EU28. The United Kingdom left the European Union on 31 January 2020.

What Happened to Reported Content Across European Union Member States (2018)

The chart shows that, among respondents who took action after encountering illegal content online, respondents from Hungary were the most likely to report that the content was taken down, while respondents from Estonia were the least likely to do so. Participants answered the question "What happened to reported content?", for which multiple answers were possible. European Union refers to the EU28. The United Kingdom left the European Union on 31 January 2020.