Authors:
(1) Anh V. Vu, University of Cambridge, Cambridge Cybercrime Centre ([email protected]);
(2) Alice Hutchings, University of Cambridge, Cambridge Cybercrime Centre ([email protected]);
(3) Ross Anderson, University of Cambridge, and University of Edinburgh ([email protected]).
Table of Links
2. Deplatforming and the Impacts
2.2. The Kiwi Farms Disruption
3. Methods, Datasets, and Ethics, and 3.1. Forum and Imageboard Discussions
3.2. Telegram Chats and 3.3. Web Traffic and Search Trends Analytics
3.4. Tweets Made by the Online Community and 3.5. Data Licensing
4. The Impact on Forum Activity and Traffic, and 4.1. The Impact of Major Disruptions
5. The Impacts on Relevant Stakeholders and 5.1. The Community that Started the Campaign
6. Tensions, Challenges, and Implications and 6.1. The Efficacy of the Disruption
6.2. Censorship versus Free Speech
6.3. The Role of Industry in Content Moderation
6.5. Limitations and Future Work
7. Conclusion, Acknowledgments, and References
2. Deplatforming and the Impacts
There is a complex ecosystem of online abuse that has been evolving for decades [35], where toxic content, surveillance, and content leakage are growing threats [36], [37]. While the number of personally targeted victims is relatively low, an increasing number of individuals, including children, are being exposed to online hate speech [38]. There can be a large grey area between criminal behaviour and socially acceptable behaviour online, just as in real life. And just as a pub landlord will throw out rowdy customers, so platforms have acceptable-use policies backed by content moderation [39] to enhance the user experience and protect advertising revenue [40].
Deplatforming refers to blocking, excluding, or restricting individuals or groups from using online services, on the grounds that their activities are unlawful or that they do not comply with the platform’s acceptable-use policy [7]. Various extremists and criminals have been exploiting online platforms for over thirty years, resulting in an ecosystem in which some harms are prohibited by the criminal law (such as terrorist radicalisation and child sex abuse material) while many others are blocked by platforms seeking to provide welcoming spaces for their users and advertisers. For a history and summary of current US legislative tussles and their possible side-effects, see Fishman [41]. The idea is that if a platform is used to disseminate abusive speech, removing the speech, or indeed the speakers, could restrict its spread, make it harder for hate groups to recruit, organise, and coordinate, and ultimately protect individuals from mental and physical harm. Deplatforming can be done in various ways, ranging from limiting users’ access and restricting their activity for a time period, to suspending an account, or even barring an entire group of users from one or more services. It is not necessarily final, however: groups banned from major platforms can displace to other channels, whether smaller websites or messenger services [7].
Different countries draw the line between free speech and hate speech differently. For example, the USA allows the display of Nazi symbols while France and Germany do not [42]. Private firms offering ad-supported social networks generally operate much more restrictive rules, as their advertisers do not want their ads appearing alongside content that prospective customers are likely to find offensive. People wishing to generate and share such material therefore tend to congregate on smaller forums. Some argue that taking down such forums infringes on free speech and may lead to censorship of legitimate voices and dissenting opinions, especially if it is perceived as politically motivated. Others maintain that deplatforming is necessary to protect vulnerable communities from harm. Debates rage in multiple legislatures; as one example, the UK Online Safety Bill will enable the (politically appointed) head of Ofcom, the UK broadcast regulator, to obtain court orders to shut down online places that are considered harmful [43]. This leads us to ask: how effective might such an order be?
This paper is available on arXiv under the CC BY 4.0 DEED license.