From February 15th – 28th, a team of 3 monitors from the Media Diversity Institute, as part of the Get The Trolls Out! project, monitored a selection of social media platforms for religious hate speech, focusing on calls for violence. This report outlines the main findings, including worrying hate speech trends spotted on social media platforms.
Monitoring was conducted using a set of keywords collected and used throughout the Get The Trolls Out! project. Once a piece of content was found on a social media platform that went against Community Guidelines, the case was noted internally and then reported directly to the platform. The team then monitored how long it took for the platforms to take action in terms of removals.
In total, 142 cases of calls for violence on religious grounds were reported to various social media platforms over the two-week period, with an overall removal rate of 49.30%. Twitter had the highest removal rate at 63.90%, and YouTube had the lowest at 0%.
While this overall data shows us some interesting trends in terms of removal rates, observations made during the monitoring exercise revealed what we feel are clear gaps in policy implementation on social media platforms. We found that violent hashtags were used without consequence, and that a clear antisemitic theme was present on several social media platforms, with people calling for “another Holocaust”. We explain these trends in more detail, and with examples, below. Please note that none of the examples shown in this report had been removed at the time of writing (March 2021), despite being reported. We would also like to note that for many of the cases we reported on Twitter that were subsequently removed, we did not receive a notification of removal. This lack of transparency makes the reporting process less accessible.
To read and download the full report follow this link.