The Tech Industry’s Growing Role in Online Content Moderation

by Deplatform, April 30th, 2025

Too Long; Didn't Read

The role of infrastructure providers in content moderation is evolving, with companies increasingly pressured to act as content moderators. This shift complicates the process of deplatforming controversial websites like KIWI FARMS. While some tech firms have succeeded in temporarily disrupting such platforms, the long-term effectiveness remains questionable. The growing centralization of the Internet and the political complexities surrounding online content regulation highlight the challenges in addressing harmful online communities. Future work should focus on better understanding the real-world impacts of such communities and refining legal and technical measures for more effective online content moderation.


Authors:

(1) Anh V. Vu, University of Cambridge, Cambridge Cybercrime Centre ([email protected]);

(2) Alice Hutchings, University of Cambridge, Cambridge Cybercrime Centre ([email protected]);

(3) Ross Anderson, University of Cambridge, and University of Edinburgh ([email protected]).

Abstract and 1 Introduction

2. Deplatforming and the Impacts

2.1. Related Work

2.2. The Kiwi Farms Disruption

3. Methods, Datasets, and Ethics, and 3.1. Forum and Imageboard Discussions

3.2. Telegram Chats and 3.3. Web Traffic and Search Trends Analytics

3.4. Tweets Made by the Online Community and 3.5. Data Licensing

3.6. Ethical Considerations

4. The Impact on Forum Activity and Traffic, and 4.1. The Impact of Major Disruptions

4.2. Platform Displacement

4.3. Traffic Fragmentation

5. The Impacts on Relevant Stakeholders and 5.1. The Community that Started the Campaign

5.2. The Industry Responses

5.3. The Forum Operators

5.4. The Forum Members

6. Tensions, Challenges, and Implications and 6.1. The Efficacy of the Disruption

6.2. Censorship versus Free Speech

6.3. The Role of Industry in Content Moderation

6.4. Policy Implications

6.5. Limitations and Future Work

7. Conclusion, Acknowledgments, and References

Appendix A.

6.3. The Role of Industry in Content Moderation

The rapid rise of cybercrime-as-a-service throughout the 2010s has made attacks easier than ever. A teenager with as little as $10 can use a DDoS-for-hire service to knock your website offline [101], so controversial websites depend on the grace and favour of a large hosting company or a specialist DDoS-prevention contractor. This is just one aspect of a broader trend in tech: the Internet is becoming more centralised around a small number of big firms, ranging from online social platforms and hosting companies to transit networks, service providers, and exchange points [102]. While some provide moderation tools favoured by content creators [103], others claim to be committed to fighting hate, harassment, and abuse yet are disproportionately responsible for serving bad content online [90], and the effort they put into the fight varies [104], [105]. Content moderation has recently shifted to the infrastructure layer [106]; now that activists have pressured infrastructure providers to act as content moderators, policymakers will be tempted to do the same. Some providers may stand up to such pressure, since moderation is both expensive and difficult, but others may fold from time to time under political pressure or legal compulsion. This would undermine the end-to-end principle of the Internet, as enshrined for example in CDA s 230 in the USA and in the EU’s Net Neutrality law [107].


Private companies must remove illegal content from their infrastructure when directed to do so by a court order. However, deplatforming KIWI FARMS or any other customer does not violate the principle of free speech: it is essentially a contractual matter, and providers have the right to withdraw support from a website that violates their policies. Infrastructure providers may occasionally need to work expediently with law enforcement when there is an imminent threat to life. Most providers have worked out ways of doing this, but the mechanisms can be too sluggish. Cloudflare attempted to collaborate with law enforcement over KIWI FARMS, yet the process could not keep up with the escalating threats, and it ended up taking unilateral action based on its terms of service [24]. In an ideal world, there would be an international legal framework for taking down websites that host illegal content or promote crime; unfortunately, no such framework exists.


The Budapest Convention [108] criminalises some material on which all states agree, such as child sex-abuse images, but even there the boundaries are contested [109]. Online drug markets such as SILK ROAD and HANSA MARKET have been taken down under other laws – drug laws – which also enjoy international standardisation and collaboration. Copyright infringement likewise attracts international treaties and coordinated action by the tech majors, though civil law plays a greater role here than criminal law. Then there is material about which some states feel strongly but others do not; ‘one man’s freedom fighter is another man’s terrorist’. And then there is a vast swamp of fake news, animal cruelty, conspiracy theories, and other material that many find unpleasant or distressing, and which social networks moderate for the comfort of both their users and their advertisers. Legislators occasionally call for better policing of some of this content.

6.4. Policy Implications

Content moderation has become a political, policy, and public concern [110], [111]. The UK Online Safety Bill proposes a new regulator that will be able to apply for a court order mandating that tech firms disrupt an objectionable online activity [43]. One might imagine Ofcom deciding to take down KIWI FARMS, had their target been a resident of Britain rather than Canada, and going to the various tech firms involved in the disruption we describe here, serving them one after another with orders signed by a judge in the High Court in London. Even if all the companies were to comply, rather than appealing or simply ignoring the court, it is hard to see how such an operation could be anywhere near as swift, coordinated, or effective as the action the tech companies took here on their own initiative. Where the censor’s OODA loop – the process by which it observes, orients, decides, and acts – involves a government agency assessing the effects of each intervention and then going to court to order the next one, the time constant stretches from hours to months. And in any case, government interventions in this field are often significant but rather short-lived [14], [15].


One factor contributing to the resilience of KIWI FARMS is the technical competence of the forum owner. He has consistently and capably dealt with DDoS attacks on the forum, maintaining its codebase after XenForo revoked its licence, upgrading server hardware and network capacity, and developing in-house DDoS-protection mechanisms. Deplatforming can be more effective if the maintainer of a blatantly illegal website can be arrested and jailed (or otherwise incapacitated), as happened with SILK ROAD. With a forum like KIWI FARMS, whose operator has denounced criminal acts perpetrated via his infrastructure [55], the criminal-law option may simply not be available. The art of being a provocateur includes stopping just short of the point at which an aggressive criminal-law response would follow. This exposes the limits of civil-law remedies and voluntary action by platforms.
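The paper does not describe how these in-house protections work. As a minimal, purely illustrative sketch in Python – assuming, hypothetically, a combination of per-IP token-bucket rate limiting and a proof-of-work challenge, two common application-layer defences a self-hosting operator might adopt – such a mechanism could look like the following; the rates, difficulty, and function names are all our assumptions, not details taken from the forum.

```python
import hashlib
import os
import time
from collections import defaultdict

# Hypothetical illustration only: per-IP token-bucket rate limiting plus a
# proof-of-work challenge that makes bulk requests expensive for a botnet.
RATE = 5          # sustained requests per second allowed per IP (assumed)
BURST = 20        # short burst allowance per IP (assumed)
DIFFICULTY = 20   # leading zero bits required in the proof-of-work hash (assumed)

_buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

def allow_request(ip: str) -> bool:
    """Token bucket: refill at RATE tokens/sec, spend one token per request."""
    b = _buckets[ip]
    now = time.monotonic()
    b["tokens"] = min(BURST, b["tokens"] + (now - b["last"]) * RATE)
    b["last"] = now
    if b["tokens"] >= 1:
        b["tokens"] -= 1
        return True
    return False

def issue_challenge() -> bytes:
    """Random nonce the client must extend until the hash meets the difficulty."""
    return os.urandom(16)

def verify_proof(challenge: bytes, proof: bytes) -> bool:
    """Accept if SHA-256(challenge || proof) starts with DIFFICULTY zero bits."""
    digest = hashlib.sha256(challenge + proof).digest()
    return int.from_bytes(digest, "big") >> (256 - DIFFICULTY) == 0

def solve(challenge: bytes) -> bytes:
    """What a legitimate client does once per session: brute-force a valid proof."""
    counter = 0
    while True:
        proof = counter.to_bytes(8, "big")
        if verify_proof(challenge, proof):
            return proof
        counter += 1
```

The design point of the proof-of-work approach is that the cost is shifted to the client: a real visitor pays it once per session, while an attacker must pay it for every connection, blunting volumetric abuse without a third-party protection service.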


Previous work has also explored why governments are less able to take down bad sites than private actors [11]; that work analysed single websites with clearly illegal content, such as those hosting malware, phishing lures, or sex-abuse images. This study shows why taking down an active community is likely to be even harder. Even when several tech firms roll up their sleeves and try to suppress a community, some of whose members have committed crimes and against which there is an industry consensus, the net effect may be modest at best. Our case study may be close to the best result that can be expected of online censorship, yet it cut users, posts, threads, and traffic only by about half. Our findings suggest that using content-moderation law to suppress an unpleasant online community may be very challenging.

6.5. Limitations and Future Work

Measuring the link between physical harassment and KIWI FARMS, as well as the cost of the actual harm caused by forum members to real-world victims, would be a valuable contribution. However, we lack ground-truth data about real-life events, which cannot be observed from forum discussions alone. Investigating doxxing-related posts that share real victims’ information would be a good start, but the main challenge is validating data posted by untrusted users at scale, in the absence of a robust way to identify users. Our forum data is of limited use for studying user migration from KIWI FARMS to its competitor LOLCOW FARM, as pseudonyms are unavailable on LOLCOW FARM; it is therefore unclear whether some KIWI FARMS members have moved there.
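As a hedged illustration of such a starting point (not the authors’ pipeline), one could flag doxxing-candidate posts by matching common patterns of personally identifiable information; every regex, pattern name, and threshold below is an assumption for exposition, and the approach leaves the harder problem – validating matches from untrusted users at scale – untouched.

```python
import re

# Illustrative assumption, not the authors' method: flag posts that appear
# to share personal information using simple PII patterns. Real pipelines
# would need far more robust patterns and, crucially, validation.
PII_PATTERNS = {
    "phone":   re.compile(r"\b(?:\+?1[ .-]?)?\(?\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}\b"),
    "email":   re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn":     re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "address": re.compile(r"\b\d{1,5}\s+\w+(?:\s\w+)*\s(?:St|Ave|Rd|Blvd|Lane|Dr)\b", re.I),
}

def pii_hits(post_text: str) -> dict[str, list[str]]:
    """Return every PII-like match found in a post, keyed by pattern name."""
    return {name: pat.findall(post_text)
            for name, pat in PII_PATTERNS.items()
            if pat.search(post_text)}

def looks_like_dox(post_text: str, min_kinds: int = 2) -> bool:
    """Heuristic (assumed threshold): a post combining two or more kinds
    of PII is treated as a doxxing candidate for manual review."""
    return len(pii_hits(post_text)) >= min_kinds
```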


Our data scrapers run in near real time, but there is still a chance of missing messages that are posted and then removed swiftly; we expect the number of such missing messages to be relatively small. More insight could be gained from private or protected posts, as people can be more extreme when posting in private; however, we chose not to analyse them due to potential harm and to legal and ethical concerns. KIWI FARMS is now back online, and may well succeed in maintaining its accessibility on the clearnet. We will continue to monitor it, and will extend our measurements to more recent incidents in a follow-up report.
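To make the monitoring trade-off concrete, the sketch below (not our actual scraper) polls a thread’s post list, archives new posts, and marks posts that disappear between polls as deleted; the endpoint URL, JSON shape, and poll interval are assumptions. Anything removed faster than one poll interval is still missed, which is exactly the gap acknowledged above.

```python
import sqlite3
import time

import requests

POLL_SECONDS = 60                # assumed poll interval
THREAD_IDS = ["12345"]           # hypothetical thread identifiers to monitor

db = sqlite3.connect("archive.db")
db.execute("CREATE TABLE IF NOT EXISTS posts "
           "(id TEXT PRIMARY KEY, body TEXT, deleted INTEGER DEFAULT 0)")

def poll_thread(thread_id: str) -> None:
    # Hypothetical endpoint and JSON shape: a list of {"id": ..., "body": ...}.
    url = f"https://forum.example.org/threads/{thread_id}/posts.json"
    live = {p["id"]: p["body"] for p in requests.get(url, timeout=30).json()}
    seen = {row[0] for row in db.execute("SELECT id FROM posts WHERE deleted = 0")}
    # Archive posts we have not seen before.
    for pid, body in live.items():
        if pid not in seen:
            db.execute("INSERT OR IGNORE INTO posts (id, body) VALUES (?, ?)",
                       (pid, body))
    # Posts previously live but now absent were removed since the last poll.
    for pid in seen - live.keys():
        db.execute("UPDATE posts SET deleted = 1 WHERE id = ?", (pid,))
    db.commit()

while True:
    for tid in THREAD_IDS:
        poll_thread(tid)
    time.sleep(POLL_SECONDS)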

This paper is available on arXiv under a CC BY 4.0 DEED license.

